Sr. Data Architect/Data Modeler Resume
Minneapolis, MN
SUMMARY:
- Over 9 years of experience as a Data Architect/Data Modeler and Oracle RAC Database Administrator, with extensive data architecture, database development, and administration experience across the full SDLC, designing and building relational databases for OLTP, OLAP, decision support (DSS), client/server, data warehouse, and multi-tier web-based projects.
- Experience with Agile/Scrum and Waterfall methodologies across complete data warehouse (DW) life-cycle projects.
- Experience in importing and exporting data using Sqoop from HDFS to Relational Database Systems (RDBMS) and from RDBMS to HDFS.
- Expertise in developing solutions around NoSQL databases such as MongoDB and HBase.
- Extensive expertise in data warehousing across multiple database platforms, as well as logical and physical data modeling using tools such as Erwin, PowerDesigner, and ER/Studio.
- Skilled in scientific application design and development and scientific data-processing technologies: VBA, Pipeline Pilot, R, Python, and web service integrations.
- Extensive experience with SSIS, SSRS, and SSAS deployments and maintenance, providing L1/L3 support with the required services and proxies and granular permissions across regional security levels.
- Strong data modeling experience with ER diagrams, dimensional data modeling, and conceptual/logical/physical modeling using Third Normal Form (3NF), Star Schema, and Snowflake schema designs, using tools such as Erwin, ER/Studio, and SAP PowerDesigner.
- Experienced in integrating various relational and non-relational sources, such as DB2, Teradata, Oracle, Netezza, SQL Server, NoSQL, COBOL, XML, and flat files, into a Netezza database.
- Business Intelligence: requirements analysis, Key Performance Indicators (KPIs), metrics development, sourcing and gap analysis, OLAP concepts and methods, aggregates/materialized views and performance, rapid prototyping, tool selection, and semantic layers.
- Excellent experience in writing SQL queries to validate data movement between the different layers of a data warehouse environment (a minimal validation sketch follows this summary).
- Data Warehousing: full life-cycle project leadership, business-driven requirements gathering, capacity planning, feasibility analysis, enterprise and solution architecture, design, construction, data quality, profiling and cleansing, source-to-Confidential mapping, gap analysis, data integration/ETL, SOA, ODA, data marts, Inmon/Kimball methodology, data modeling for OLTP, canonical modeling, and dimensional modeling for data warehouse star/snowflake designs.
- Experience with BI/DW solutions (ETL, OLAP, data marts), Informatica, and BI reporting tools such as Tableau and QlikView; also experienced in leading teams of application, ETL, and BI developers as well as testing teams.
- Experienced in technical consulting and end-to-end delivery, covering data modeling, data governance, and the design, development, and implementation of solutions.
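To illustrate the kind of layer-to-layer data-movement validation mentioned above, the following is a minimal SQL sketch of a reconciliation check between a staging layer and a warehouse layer; the schema, table, and column names (stg.orders, dw.fact_orders, order_amount) are hypothetical examples, not taken from any actual engagement.

```sql
-- Minimal sketch: reconcile row counts and amount totals between a
-- hypothetical staging table and the warehouse fact table it loads.
-- Schema, table, and column names are illustrative only.
SELECT
    s.src_row_count,
    t.tgt_row_count,
    s.src_total_amount,
    t.tgt_total_amount,
    s.src_row_count  - t.tgt_row_count   AS row_count_diff,
    s.src_total_amount - t.tgt_total_amount AS amount_diff
FROM
    (SELECT COUNT(*)          AS src_row_count,
            SUM(order_amount) AS src_total_amount
     FROM   stg.orders) s
CROSS JOIN
    (SELECT COUNT(*)          AS tgt_row_count,
            SUM(order_amount) AS tgt_total_amount
     FROM   dw.fact_orders) t;
```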
TECHNICAL SKILLS:
- Big Data technologies: MapReduce, HBase, HDFS, Sqoop, Spark, Hadoop, Hive, PIG, Impala.
- Data Modeling Tools: ER/Studio 9.7/9.0, Erwin 9.6/9.5, Sybase PowerDesigner.
- OLAP Tools: Tableau, SAP BO, SSAS, Business Objects, and Crystal Reports 9/7
- Programming Languages: SQL, PL/SQL, UNIX shell Scripting, PERL, AWK, SED, Scala, Python
- Databases: Oracle 12c/11g, Teradata R15/R14, MS SQL Server, DB2.
- Testing and Defect Tracking Tools: HP/Mercury (Quality Center, WinRunner, QuickTest Professional, Performance Center, Requisite), MS Visio & Visual SourceSafe
- Operating System: Windows, UNIX, Sun Solaris
- ETL/Data warehouse Tools: Informatica 9.6/9.1, SAP Business Objects XIR3.1/XIR2, Talend, Tableau, Pentaho.
- Methodologies: Agile, RAD, JAD, RUP, UML, System Development Life Cycle (SDLC), Ralph Kimball and Bill Inmon methodologies, Waterfall Model.
PROFESSIONAL EXPERIENCE:
Confidential, Minneapolis MN
Sr. Data Architect/Data Modeler
Responsibilities:
- Helped the client meet critical business needs by modeling and maintaining business intelligence database environments, and researched, evaluated, architected, and deployed new tools, frameworks, and patterns to build sustainable Big Data platforms for clients.
- Designed and architected AWS cloud solutions for data and analytical workloads such as data warehouses, Big Data, data lakes, real-time streams, and advanced analytics.
- Designed and developed the architecture for a data services ecosystem spanning relational, NoSQL, and Big Data technologies, and applied Agile methodology to build an integrated data warehouse, participating in multiple sprints across various tracks throughout the project lifecycle.
- Developed and implemented end-to-end Big Data analytics and EDW/BI projects using multiple tools and platforms.
- Developed data mapping, data governance, transformation, and cleansing rules for the Master Data Management architecture involving OLTP and ODS systems.
- Worked with normalization and de-normalization concepts and design methodologies such as the Ralph Kimball and Bill Inmon data warehouse approaches.
- Developed prototype solutions to verify capabilities for new systems development, enhancement, and maintenance of MDM.
Environment: Erwin r9.6/7, Netezza, SQL Server 2016, Informatica 10.2, Teradata 15, Power BI, OLAP, OLTP, UNIX, MDM, Hadoop, Hive, Pig, Salesforce.com, HBase, HDFS, SAP, AWS, Redshift, EMR, S3, Apache Flume, Ralph Kimball and Bill Inmon methodologies, PL/SQL, BTEQ, Python.
Sr. Data Architect/Data Modeler
Confidential, Chicago IL
Responsibilities:
- Collaborated in identifying current problems, constraints, and root causes in data sets to define descriptive and predictive solutions supported by Hadoop HDFS, MapReduce, Pig, Hive, and HBase, and further developed reports in Tableau.
- Heavily involved in the Data Architect role, reviewing business requirements and composing source-to-Confidential data mapping documents; performed relational and dimensional data modeling to create logical and physical database designs and ER diagrams using data modeling tools such as Erwin.
- Created the data model for the subject area in the Enterprise Data Warehouse (EDW) and applied data governance rules (primary qualifiers, class words, and valid abbreviations in table and column names).
- Worked on AWS Redshift and RDS, implementing data models and loading data on both platforms.
- Involved in several facets of MDM implementations including Data Profiling, Metadata acquisition and data migration.
- Worked on NoSQL databases including HBase, MongoDB, and Cassandra; implemented a multi-datacenter, multi-rack Cassandra cluster.
- Developed data mapping, data governance, transformation, and cleansing rules for the Master Data Management architecture.
Environment: Erwin r9.5, Cognos, SQL Server 2016, DB2, SSIS, OLAP, OLTP, Linux, MDM, Hadoop, Hive, Pig, HBase, SAP, AWS, Redshift, PL/SQL, ETL, MongoDB, AWS S3, Informatica 9.6, Salesforce.com, Power BI, AWS EMR, SQL, Teradata, Netezza, Oracle and SSRS.
Sr. Data Modeler /Data Analyst
Confidential, Saint Cloud, MN
Responsibilities:
- Designed the procedures for moving data from all source systems into the data warehousing system; the data was standardized to store the various business units in tables.
- Gathered business requirements by organizing and managing scheduled meetings with business stakeholders, application architects, technical architects, and IT analysts.
- Handled data from many sources using Excel and SQL queries, managed internal and external reporting requests, and helped automate processes that moved data from multiple sources into an internal data warehouse.
- Assisted in data review of service failure investigations, providing this information to the Quality Manager for processing.
- Created DDL scripts using ER/Studio and source-to-Confidential mappings to bring data from the source systems into the warehouse, and developed dimensional models for data warehouse/OLAP applications by identifying the required facts and dimensions (a minimal DDL sketch follows the environment list below).
Environment: ER/Studio, OOD, OLAP, OLTP, Teradata 13, MS Excel, Oracle 10g, SQL, Power BI, PL/SQL, MS Visio, SQL*Loader, Netezza.
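As a rough illustration of the dimensional modeling and DDL work described in the last role, below is a minimal star-schema sketch with one dimension table and one fact table; all table names, column names, and data types are hypothetical and not taken from the actual warehouse.

```sql
-- Minimal star-schema sketch: one dimension and one fact table.
-- Names and data types are illustrative only, not from the actual EDW.
CREATE TABLE dim_customer (
    customer_key   INTEGER       NOT NULL PRIMARY KEY,  -- surrogate key
    customer_id    VARCHAR(20)   NOT NULL,              -- natural/business key
    customer_name  VARCHAR(100),
    region         VARCHAR(50)
);

CREATE TABLE fact_sales (
    sales_key      INTEGER       NOT NULL PRIMARY KEY,
    customer_key   INTEGER       NOT NULL REFERENCES dim_customer (customer_key),
    date_key       INTEGER       NOT NULL,              -- would reference a date dimension
    sales_amount   DECIMAL(12,2) NOT NULL,
    quantity       INTEGER       NOT NULL
);
```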