Sr. Data Architect / Modeler Resume
Owings Mills, MD
SUMMARY:
- Highly effective Data Architect with around 7 years of experience specializing in big data, cloud, and data and analytics platforms.
- Excellent knowledge in Data Analysis, Data Validation, Data Cleansing, Data Verification and identifying data mismatch.
- Excellent experience with Teradata SQL queries, Teradata indexes, and utilities such as MultiLoad, TPump, FastLoad, and FastExport.
- Expert in writing SQL queries and optimizing the queries in Oracle, SQL Server 2008 and Teradata.
- Experience in Architecture, Design and Development of large Enterprise Data Warehouse (EDW) and Data - marts for target user-base consumption.
- Experienced in integration of various relational and non-relational sources such as DB2, Teradata, Oracle, Netezza, SQL Server, NoSQL, COBOL, XML and Flat Files to Netezza database.
- Good knowledge of Hive, Sqoop, MapReduce, Storm, Pig, HBase, Flume, and Spark.
- Worked on background processes in Oracle architecture, and drilled down to the lowest levels of systems design and construction.
- Data Warehousing: Full life-cycle project leadership, business-driven requirements gathering, capacity planning, feasibility analysis, enterprise and solution architecture, design, construction, data quality, profiling and cleansing, source-target mapping, gap analysis, data integration/ETL, SOA, ODA, data marts, Inmon/Kimball methodology, data modeling for OLTP, canonical modeling, and dimensional modeling for data warehouse star/snowflake design.
- Good experience working with different ETL tool environments like SSIS and Informatica, and reporting tool environments like SQL Server Reporting Services (SSRS), Cognos, and Business Objects.
- Experience in BI/DW solutions (ETL, OLAP, data marts), Informatica, and BI reporting tools like Tableau and QlikView; also experienced leading teams of application, ETL, and BI developers as well as testing teams.
- Experience in developing MapReduce programs using Apache Hadoop for analyzing big data as per requirements.
- Experienced in big data analysis and developing data models using Hive, Pig, MapReduce, and SQL, with strong data architecting skills designing data-centric solutions.
- Experience in writing expressions in SSRS and expert in fine-tuning reports; created many drill-through and drill-down reports using SSRS.
- Experienced in technical consulting and end-to-end delivery with architecture, data modeling, data governance, and design, development, and implementation of solutions.
- Experience in Big Data Hadoop Ecosystem in ingestion, storage, querying, processing and analysis of big data.
- Skillful in Data Analysis using SQL on Oracle, MS SQL Server, DB2 & Teradata.
- Extensive experience in development of T-SQL, Oracle PL/SQL Scripts, Stored Procedures and Triggers for business logic implementation.
- Experienced in Batch processes, Import, Export, Backup, Database Monitoring tools and Application support.
- Good understanding and hands on experience with AWS S3 and EC2.
- Good experience with the programming languages Python and Scala.
- Business Intelligence: Requirements analysis, Key Performance Indicators (KPIs), metrics development, sourcing and gap analysis, OLAP concepts and methods, aggregates/materialized views and performance, rapid prototyping, tool selection, and semantic layers.
- Excellent experience in writing SQL queries to validate data movement between different layers in a data warehouse environment.
- Worked on Informatica Power Center tools-Designer, Repository Manager, Workflow Manager.
- Logical and physical database design of tables, constraints, indexes, etc. using Erwin.
- Excellent knowledge in preparing required project documentation and tracking and reporting regularly on the status of projects to all project stakeholders.
- Extensive ETL testing experience using Informatica 9x/8x, Talend, Pentaho.
- Proficient in System Analysis, ER/Dimensional Data Modeling, Database design and implementing RDBMS specific features.
- Heavy use of Access queries, VLOOKUP, formulas, Pivot Tables, etc. Working knowledge of CRM automation with Salesforce.com and SAP.
- Experience in Dimensional Data Modeling, Star/Snowflake schema, FACT & Dimension tables.
- Expertise on Relational Data modeling (3NF) and Dimensional data modeling.
- Practical understanding of the Data modeling (Dimensional & Relational) concepts like Star-Schema Modeling, Snowflake Schema Modeling, Fact and Dimension tables.
- Assist in creating communication materials based on data for key internal /external audiences.
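The layer-to-layer SQL validation mentioned above can be sketched minimally as follows, using SQLite as a stand-in for the warehouse; the table and column names are illustrative only, not from any real engagement:

```python
import sqlite3

# In-memory database standing in for staging and target layers of a warehouse.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
    CREATE TABLE stg_orders (order_id INTEGER, amount REAL);
    CREATE TABLE dw_orders  (order_id INTEGER, amount REAL);
    INSERT INTO stg_orders VALUES (1, 10.0), (2, 25.5), (3, 7.25);
    INSERT INTO dw_orders  VALUES (1, 10.0), (2, 25.5), (3, 7.25);
""")

# Row-count reconciliation between layers.
stg_count = cur.execute("SELECT COUNT(*) FROM stg_orders").fetchone()[0]
dw_count = cur.execute("SELECT COUNT(*) FROM dw_orders").fetchone()[0]

# Detect rows present in staging but missing from the target layer.
missing = cur.execute("""
    SELECT s.order_id FROM stg_orders s
    LEFT JOIN dw_orders d ON s.order_id = d.order_id
    WHERE d.order_id IS NULL
""").fetchall()

print(stg_count == dw_count and not missing)  # True when the layers agree
```

The same count-and-antijoin pattern applies on Teradata or Oracle; only the connection layer changes.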
TECHNICAL SKILLS:
Data Modeling Tools: Erwin R6/R9, IBM Infosphere Data Architect, ER Studio and Oracle Designer.
Database Tools: Microsoft SQL Server 12.0, Teradata 15.0, Oracle 12c/11g/9i, and MS Access.
ETL/Data Warehouse Tools: Informatica 9.6/9.1/8.6.1/8.1, SAP Business Objects XIR3.1/XIR2, Web Intelligence, Talend, Tableau 8.2, Pentaho.
BI Tools: Tableau 7.0/8.2, Tableau Server 8.2, Tableau Reader 8.1, SAP Business Objects, Crystal Reports.
Packages: Microsoft Office 2010, Microsoft Project 2010, SAP, Microsoft Visio, SharePoint Portal Server.
Project Execution Methodologies: Agile, Ralph Kimball and Bill Inmon data warehousing methodologies, Rational Unified Process (RUP), Rapid Application Development (RAD), Joint Application Development (JAD).
Operating Systems: Windows, Unix, Sun Solaris.
RDBMS: Microsoft SQL Server 14.0, Teradata 15.0, Oracle 12c/11g/10g/9i, and MS Access.
PROFESSIONAL EXPERIENCE:
Confidential, Owings Mills, MD
Sr. Data Architect /Modeler
Responsibilities:
- Consulted and supported Data Architect / Data Modeler initiatives in the development of integrated data repository transformed from legacy system to new operational system and data warehouse.
- Provided solutions for ingesting data into the new Hadoop big data platform by designing data models for multiple features to help analyze the data on graph databases.
- Applied business rules in modeling data marts and performed data profiling to model new data structures.
- Delivered scope, requirements, and design for transactional and data warehouse system which included Oracle DB, SQL server, and Salesforce database.
- Performed importing data from various sources to the Cassandra cluster using Python APIs.
- Analyzed the source data and worked with the Data Architect in designing and developing the logical and physical data models for the Enterprise Data Warehouse.
- Developed data mapping, data governance, and transformation and cleansing rules for the Master Data Management architecture involving OLTP and ODS.
- Worked on data warehousing concepts like Ralph Kimball methodology, Bill Inmon methodology, OLAP, OLTP, Star Schema, Snowflake Schema, Fact Tables, and Dimension Tables.
- Worked on detailed ER diagrams built from clients' requirements.
- Proposed, designed, and supported ETL implementation using Teradata tools and technologies like BTEQ, MultiLoad, FastLoad, FastExport, SQL Assistant, and Teradata Parallel Transporter (TPT).
- Responsible for Big data initiatives and engagement including analysis, brainstorming, POC.
- Implemented Data Governance and Data Quality standards.
- Responsible for the data architecture design delivery, data model development, review, approval and Data warehouse implementation.
- Handled importing data from various data sources, performed transformations using Hive and MapReduce, and loaded data into HDFS.
- Designed and developed AWS Cloud solutions for data and analytical workloads such as warehouses, Big Data, data lakes, real-time streams and advanced analytics.
- Used SSRS to create customized, on-demand, and ad-hoc reports, and was involved in analyzing multi-dimensional reports in SSRS.
- Data modeling, Design, implement, and deploy high-performance, custom applications at scale on Hadoop /Spark.
- Designed the Logical Data Model using Erwin 9.64 with the entities and attributes for each subject area.
- Integrated data from different data warehouses using Talend Open Studio.
- Utilized various transformations in mappings, including Joiner, Aggregator, Union, SQL, XML Parser, Expression, Lookup, Filter, Update Strategy, Stored Procedure, and Router.
- Developed a long-term data warehouse roadmap and designed and built the data warehouse framework per the roadmap.
- Designed both 3NF data models for ODS and OLTP systems and dimensional data models using star and snowflake schemas.
- Designed and developed a Data Lake using Hadoop for processing raw and processed claims via Hive and Informatica.
- Worked with Netezza and Oracle databases and implemented various logical and physical data models for them.
- Applied data analysis, data mining and data engineering to present data clearly.
- Developed and implemented data cleansing, data security, data profiling and data monitoring processes.
- Designed and developed data models for a data services ecosystem spanning relational, NoSQL, and big data technologies.
- Specified overall data architecture for all areas and domains of the enterprise, including data acquisition, ODS, MDM, data warehouse, data provisioning, ETL, and BI.
- Demonstrated expertise utilizing SQL Server Integration Services (SSIS), Data Transformation Services (DTS), and DataStage; ETL package design; and RDBMS platforms like SQL Server, Oracle, and DB2.
- Advised on and led projects involving ETL-related activities and the migration or conversion of data between enterprise data systems; coordinated interactions between central IT, business units, and data stewards to achieve desired organizational outcomes.
- Gathered and analyzed existing physical data models for in scope applications and proposed the changes to the data models according to the requirements.
- Advised on and enforced data governance to improve the quality and integrity of data, with oversight of the collection and management of operational data.
- Responsible for planning, installing, and supporting AWS infrastructure.
- Proficient in SQL across a number of dialects (MySQL, PostgreSQL, Redshift, SQL Server, and Oracle).
- Experience with AWS ecosystem (EC2, S3, RDS, Redshift).
- Integrated Crystal Reports using Erwin Data Modeler.
- Used Erwin with support for Teradata 13.0 and SSL.
- Involved in designing Logical and Physical data models for different database applications using the Erwin.
- Ensured high-quality data and understood how data is generated out of experimental design and how these experiments can produce actionable, trustworthy conclusions.
Environment: Erwin r9.64, Oracle 12c, MS Office, SQL, TOAD Benchmark Factory, SQL Loader, PL/SQL, DB2, SharePoint, Talend, Redshift, SQL Server, Hadoop, Spark, AWS.
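The star/snowflake dimensional modeling described in this role can be sketched as a minimal fact table joined to two dimensions; the schema and the data below are hypothetical, with SQLite standing in for the warehouse engine:

```python
import sqlite3

# Minimal star schema: one fact table keyed to two dimension tables.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
    CREATE TABLE dim_date (
        date_key INTEGER PRIMARY KEY,
        full_date TEXT
    );
    CREATE TABLE dim_product (
        product_key INTEGER PRIMARY KEY,
        product_name TEXT
    );
    CREATE TABLE fact_sales (
        date_key INTEGER REFERENCES dim_date(date_key),
        product_key INTEGER REFERENCES dim_product(product_key),
        quantity INTEGER,
        revenue REAL
    );
""")
cur.execute("INSERT INTO dim_date VALUES (20240101, '2024-01-01')")
cur.execute("INSERT INTO dim_product VALUES (1, 'Widget')")
cur.execute("INSERT INTO fact_sales VALUES (20240101, 1, 3, 29.97)")

# Typical star-join query: aggregate facts by a dimension attribute.
row = cur.execute("""
    SELECT p.product_name, SUM(f.revenue)
    FROM fact_sales f
    JOIN dim_product p ON f.product_key = p.product_key
    GROUP BY p.product_name
""").fetchone()
print(row)  # ('Widget', 29.97)
```

Snowflaking would further normalize a dimension (e.g. splitting product category into its own table); the fact table is unchanged.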
Confidential, San Antonio, TX
Sr. Data Analyst/Modeler
Responsibilities:
- Designed and built relational database models and defined data requirements to meet business requirements.
- Transformed Logical Data Model to Physical Data Model ensuring the Primary Key and Foreign key relationships in PDM, Consistency of definitions of Data Attributes and Primary Index considerations.
- Wrote complex SQL queries for validating the data against different kinds of reports generated by Business Objects XIR2.
- Designed ER diagrams (Physical and Logical using Erwin) and mapping the data into database objects and identified the Facts and Dimensions from the business requirements and developed the logical and physical models using Erwin.
- Developed and deployed Data Warehouse infrastructure using SSIS for Data integration and SSRS to automate reports generations.
- Developed strategies for data acquisitions, archive recovery, and implementation of databases.
- Extensively used Erwin r9.6 for Data modeling. Created Staging and Target Models for the Enterprise Data Warehouse.
- Analyzed existing SSIS packages, made changes to improve their performance, and added standard logging and a configuration system.
- Created the logical data model from the conceptual model and converted it into the physical database design using Erwin r9.6.
- Worked with Architecture team to get the metadata approved for the new data elements that are added for this project.
- Designed Star and Snowflake Data Models for Enterprise Data Warehouse using ERWIN.
- Worked with Data Steward Team for designing, documenting and configuring Informatica Data Director for supporting management of MDM data.
- Created BTEQ, FastExport, MultiLoad, TPump, and FastLoad scripts for extracting data from various production systems.
- Owned and managed all changes to the data models. Created data models, solution designs and data architecture documentation for complex information systems.
- Performed data cleaning and data manipulation activities using a NoSQL utility.
- Created Schema objects like Indexes, Views, and Sequences, triggers, grants, roles, Snapshots.
- Worked on reverse engineering data models from database instances and scripts.
- Developed mapping spreadsheets for (ETL) team with source to target data mapping with physical naming standards, data types, volumetric, domain definitions, and corporate meta-data definitions.
- Established and maintained comprehensive data model documentation including detailed descriptions of business entities, attributes, and data relationships.
- Implemented the Data Vault modeling concept, which solved the problem of dealing with change in the environment by separating the business keys and the associations between those business keys from the descriptive attributes of those keys, using Hub and Link tables and Satellites.
- Worked on AWS Redshift and RDS for implementing models and data on RDS and Redshift.
- Gathered and analyzed business data requirements and modeled these needs, working closely with the users of the information, application developers, and architects to ensure the information models were capable of meeting their needs.
- Designed and developed Oracle PL/SQL procedures and UNIX shell scripts for data import/export and data conversions.
Environment: AWS, RDS, Big Data, JDBC, Cassandra, NoSQL, Spark, Scala, Python, Hadoop, MySQL, PostgreSQL, SQL Server, Erwin, Informatica.
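The Data Vault separation of business keys (hubs), associations (links), and descriptive history (satellites) described above can be illustrated with a toy schema; all table and column names here are hypothetical, with SQLite standing in for the target platform:

```python
import sqlite3

# Data Vault sketch: business keys live in hubs, relationships in links,
# and descriptive attributes in satellites.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
    CREATE TABLE hub_customer (
        customer_hk INTEGER PRIMARY KEY,   -- surrogate key
        customer_bk TEXT UNIQUE,           -- business key
        load_dts TEXT
    );
    CREATE TABLE hub_account (
        account_hk INTEGER PRIMARY KEY,
        account_bk TEXT UNIQUE,
        load_dts TEXT
    );
    CREATE TABLE link_customer_account (
        customer_hk INTEGER REFERENCES hub_customer(customer_hk),
        account_hk INTEGER REFERENCES hub_account(account_hk),
        load_dts TEXT
    );
    CREATE TABLE sat_customer (
        customer_hk INTEGER REFERENCES hub_customer(customer_hk),
        name TEXT,
        load_dts TEXT                      -- history kept by load timestamp
    );
""")
# A descriptive change only adds a satellite row; hub and link are untouched.
cur.execute("INSERT INTO hub_customer VALUES (1, 'CUST-001', '2024-01-01')")
cur.execute("INSERT INTO sat_customer VALUES (1, 'Alice', '2024-01-01')")
cur.execute("INSERT INTO sat_customer VALUES (1, 'Alice B.', '2024-06-01')")

history = cur.execute(
    "SELECT name FROM sat_customer WHERE customer_hk = 1 ORDER BY load_dts"
).fetchall()
print(history)  # [('Alice',), ('Alice B.',)]
```

This is what makes the model resilient to change: volatile attributes accumulate as satellite history while the key structure stays stable.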
Confidential, Minneapolis, MN
Sr. Data Analyst/Modeler
Responsibilities:
- Developed the design & Process flow to ensure that the process is repeatable.
- Participated in performance management and tuning for stored procedures, tables and database servers.
- Performed Hive programming for applications that were migrated to big data using Hadoop.
- Generated comprehensive analytical reports by running SQL queries against current databases to conduct Data Analysis.
- Performed Data Analysis, Data Migration and data profiling using complex SQL on various sources systems including Oracle and Teradata.
- Performed analysis of the existing source systems (Transaction database).
- Built reports in SSRS using complex stored procedures as sources.
- Created logical data models for Staging, ODS, and Data Mart, including a Time dimension.
- Involved in maintaining and updating Metadata Repository with details on the nature and use of applications/data transformations to facilitate impact analysis.
- Extensively used SQL, Transact SQL and PL/SQL to write stored procedures, functions, packages and triggers.
- Used forward engineering to create a physical data model with DDL that best suits the requirements from the Logical Data Model.
- Worked on the integration of existing systems at Data warehouse and Application systems level.
- Extensively used ER Studio for developing data model using star schema and Snowflake Schema methodologies.
- Created Design Fact & Dimensions Tables, Conceptual, Physical and Logical Data Models using Embarcadero ER Studio.
- Designed both 3NF data models for ODS and OLTP systems and dimensional data models using star and snowflake schemas.
- Created DDL scripts using ER Studio and source to target mappings to bring the data from source to the warehouse.
- Designed the ER diagrams, logical model (relationships, cardinality, attributes, and candidate keys), and physical database (capacity planning, object creation, and aggregation strategies) for Oracle and Teradata.
- Worked on importing and cleansing of high-volume data from various sources like Teradata, Oracle, flat files, and MS SQL Server.
- Extensively used Agile methodology as the Organization Standard to implement the data Models.
- Identified, assessed, and communicated potential risks associated with testing scope, product quality, and schedule.
- Finalized the naming standards for data elements and ETL jobs and created a data dictionary for metadata management.
- Produced PL/SQL statement and stored procedures in DB2 for extracting as well as writing data.
- Designed Logical & Physical Data Model /Metadata/ data dictionary using Erwin for both OLTP and OLAP based systems.
- Coordinated all teams to centralize metadata management updates and follow the standard naming and attribute standards for data and ETL jobs.
- Wrote and executed SQL queries to verify that data has been moved from transactional system to DSS, Data warehouse, data mart reporting system in accordance with requirements.
- Worked extensively on ER Studio for multiple Operations across Atlas Copco in both OLAP and OLTP applications.
Environment: ER Studio, SQL Server 2008, SQL Server Analysis Services, SSIS, Oracle, Business Objects XI, Rational Rose, DataStage, MS Office, MS Visio, SQL, SQL Server, Crystal Reports 9
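The forward-engineering step mentioned in this role (generating DDL from a logical model) can be sketched as follows; the model structure, entity, and column names are hypothetical stand-ins for what ER Studio or Erwin would hold:

```python
# Forward-engineering sketch: render a toy logical model as CREATE TABLE DDL.
logical_model = {
    "customer": [
        ("customer_id", "INTEGER", "PRIMARY KEY"),
        ("name", "VARCHAR(100)", "NOT NULL"),
        ("created_at", "DATE", ""),
    ],
}

def to_ddl(model):
    """Render each entity in the logical model as a CREATE TABLE statement."""
    statements = []
    for table, columns in model.items():
        cols = ",\n".join(
            f"    {name} {dtype} {constraint}".rstrip()
            for name, dtype, constraint in columns
        )
        statements.append(f"CREATE TABLE {table} (\n{cols}\n);")
    return "\n\n".join(statements)

print(to_ddl(logical_model))
```

In practice the modeling tool does this translation, applying the target platform's physical naming standards and data-type mappings along the way.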
Confidential, Chicago, IL
Sr. Data Analyst/Modeler
Responsibilities:
- Generated a separate MRM document with each assignment and shared it on SharePoint along with the PDF of updated data models.
- Created a list of domains in Erwin and worked on building up the data dictionary for the company.
- Created a Data Mapping document after each assignment and wrote the transformation rules for each field as applicable.
- Designed and developed stored procedures, queries and views necessary to support SSRS reports.
- Loaded data from an MS Access database to SQL Server using SSIS (creating staging tables and then loading the data).
- Performed Data Analysis and Data validation by writing SQL queries and Regular expressions.
- Created reports using SQL Server Reporting Services (SSRS).
- Involved in Regression, UAT and Integration testing.
- Created DDL scripts for implementing data modeling changes; created Erwin reports in HTML and RTF formats depending upon the requirement; published data models in the model mart; created naming convention files; and coordinated with DBAs to apply the data model changes.
- Developed SSIS Packages to transfer the data between SQL Server database and files.
- Worked on Unit Testing for three reports and created SQL Test Scripts for each report as required
- Extensively used Erwin as the main tool for modeling along with Visio
- Established and maintained comprehensive data model documentation including detailed descriptions of business entities, attributes, and data relationships.
- Worked on Metadata Repository (MRM) for maintaining the definitions and mapping rules up to mark.
- Worked on the data mapping process from source system to target system; created the dimensional model for the reporting system by identifying required facts and dimensions using Erwin.
- Designed Logical Data Models and Physical Data Models using Erwin.
- Developed Conceptual Data Models and Logical Data Models and transformed them into schemas using Erwin.
- Worked very closely with data architects and the DBA team to implement data model changes in the database in all environments.
- Developed a data mart for the base data in star and snowflake schemas; involved in developing the data warehouse for the database.
- Developed enhancements to MongoDB architecture to improve performance and scalability.
- Forward engineered data models, reverse engineered existing data models, and updated data models.
- Performed data cleaning and data manipulation activities using NZSQL utility.
- Analyzed the physical data model to understand the relationship between existing tables. Cleansed the unwanted tables and columns as per the requirements as part of the duty being a Data Analyst.
Environment: Erwin r8.2, Oracle SQL Developer, Oracle Data Modeler, Teradata 14, SSIS, Business Objects, SQL Server, ER/Studio, Windows XP, MS Excel.
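The data-cleansing and analysis work in this role typically starts with a column profile (null counts and distinct counts per column). A minimal sketch, with made-up sample rows:

```python
# Column-profiling sketch of the kind run before cleansing decisions:
# count nulls and distinct non-null values per column.
rows = [
    {"id": 1, "state": "MD", "zip": "21117"},
    {"id": 2, "state": None, "zip": "21117"},
    {"id": 3, "state": "TX", "zip": None},
]

def profile(records):
    """Return {column: (null_count, distinct_non_null_count)}."""
    report = {}
    for col in records[0].keys():
        values = [r[col] for r in records]
        nulls = sum(v is None for v in values)
        distinct = len({v for v in values if v is not None})
        report[col] = (nulls, distinct)
    return report

print(profile(rows))  # {'id': (0, 3), 'state': (1, 2), 'zip': (1, 1)}
```

A column with an unexpectedly high null count or a single distinct value is a candidate for cleansing rules or for dropping from the model.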
Confidential
Data Analyst
Responsibilities:
- Worked with Business users during requirements gathering and prepared Conceptual, Logical and Physical Data Models.
- Wrote PL/SQL statements, stored procedures, and triggers in DB2 for extracting as well as writing data.
- Attended and participated in information and requirements gathering sessions.
- Translated business requirements into working logical and physical data models for Data warehouse, Data marts and OLAP applications.
- Generated comprehensive analytical reports by running SQL queries against current databases to conduct Data Analysis.
- Developed database objects such as SSIS Packages, Tables, Triggers, and Indexes using T-SQL.
- Designed new application logical/physical data models in SAP PowerDesigner.
- Designed and implemented business intelligence to support sales and operations functions to increase customer satisfaction.
- Generated DDL using SAP PowerDesigner and loaded it into the Data Warehouse.
- Built reports using SSRS. Developed efficient reporting solutions. Set up subscriptions as needed for SSRS.
- Designed Star and Snowflake data models for the Enterprise Data Warehouse using Erwin.
- Created and maintained Logical Data Model (LDM) for the project. Includes documentation of all entities, attributes, data relationships, primary and foreign key structures, allowed values, codes, business rules, glossary terms, etc.
- Validated and updated the appropriate LDMs to reflect process mappings, screen designs, use cases, the business object model, and the system object model as they evolved and changed.
- Responsible for Relational data modeling (OLTP) using MS Visio (Logical, Physical and Conceptual).
- Extensively used the reverse engineering feature of Erwin to keep the data model in sync with production.
- Created business requirement documents and integrated the requirements and underlying platform functionality.
- Excellent knowledge and experience in Technical Design and Documentation.
- Used forward engineering to create a physical data model with DDL that best suits the requirements from the Logical Data Model.
- Involved in preparing the design flow for the DataStage objects to pull the data from various upstream applications, perform the required transformations, and load the data into various downstream applications.
- Performed logical data modeling, physical data modeling (including reverse engineering) using the Erwin Data Modeling tool.
Environment: Oracle 9i, PL/SQL, Solaris 9/10, Windows Server, NZSQL, Erwin 8.0, ER Studio 6.0/6.5, Toad 8.6, Informatica 8.0, IBM OS/390 (V6.0), DB2 V7.1
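The trigger work described in this role can be sketched with a self-contained example; SQLite is used here in place of DB2, and the table, trigger, and column names are illustrative only:

```python
import sqlite3

# Trigger sketch: write an audit row whenever a row's status changes,
# in the spirit of the DB2 extract/write triggers described above.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
    CREATE TABLE orders (order_id INTEGER PRIMARY KEY, status TEXT);
    CREATE TABLE order_audit (order_id INTEGER, old_status TEXT, new_status TEXT);

    CREATE TRIGGER trg_order_status
    AFTER UPDATE OF status ON orders
    BEGIN
        INSERT INTO order_audit VALUES (OLD.order_id, OLD.status, NEW.status);
    END;
""")
cur.execute("INSERT INTO orders VALUES (1, 'NEW')")
cur.execute("UPDATE orders SET status = 'SHIPPED' WHERE order_id = 1")

audit = cur.execute("SELECT * FROM order_audit").fetchall()
print(audit)  # [(1, 'NEW', 'SHIPPED')]
```

The same AFTER UPDATE pattern exists in DB2 and Oracle, with platform-specific syntax for the trigger body.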