
Sr. Data Architect/Data Modeler Resume


Union County, NJ

SUMMARY

  • Over 11 years of experience in Data Architecture design, Data Modeling and Data Analysis, with an excellent understanding of Data Warehouse, Database, Data Governance and Data Mart design.
  • Experienced in Database Creation and maintenance of physical data models with Oracle, Teradata, Netezza, DB2 and SQL Server databases.
  • Experienced in Data Modeling with a strong grasp of RDBMS concepts, Logical and Physical Data Modeling, and Multidimensional Data Modeling schemas (Star schema, Snowflake modeling, facts and dimensions).
  • Extensive experience in Normalization (1NF, 2NF, 3NF and BCNF) and De-normalization techniques for improved database performance in OLTP and Data Warehouse/Data Mart environments.
  • Experienced in multiple data sources and targets including Oracle, Netezza, DB2, SQL Server, XML and flat files.
  • Expertise in designing data warehouses using Ralph Kimball's and Bill Inmon's techniques.
  • Experienced in Netezza Administration Activities like backup/restore, performance tuning, and Security configuration.
  • Experienced in data analysis on Oracle, MS SQL Server and MS Access, with extraction of data from various database sources like Oracle, MS SQL Server, DB2 and flat files into DataStage.
  • Experienced in handling metadata from diverse sources, including relational databases (Oracle, Teradata, Netezza), XML and flat files.
  • Experienced in working with Teradata utilities like FastLoad, MultiLoad, TPump and FastExport, and with Teradata query submission and processing tools like BTEQ and Teradata SQL Assistant (Queryman).
  • Experienced in the SSIS programming model, coding several SSIS packages using multiple programming languages.
  • Experienced with Data Conversion, Data Quality, Data Profiling, Performance Tuning, System Testing and implementing RDBMS features.
  • Experienced in Client-Server application development using Oracle PL/SQL, SQL PLUS, SQL Developer, TOAD, SQL LOADER.
  • Well versed in advanced Excel concepts; worked with VLOOKUP, INDEX, MATCH, IF statements, pivot tables and complex formulas.
  • Extensive experience in SSIS Packages, SSRS reports and SSAS cubes on production server.
  • Excellent understanding of and working experience with industry-standard methodologies like the System Development Life Cycle (SDLC) per Rational Unified Process (RUP), Agile and Waterfall.
  • Experienced in ETL process using PL/SQL to populate the tables in OLTP and OLAP Data Warehouse Environment.
  • Expertise in SQL Server Analysis Services (SSAS) to deliver Online Analytical Processing (OLAP) and data mining functionality for business intelligence applications.
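The star-schema pattern mentioned above (a fact table surrounded by dimension tables) can be illustrated with a minimal sketch; all table and column names here are made up for illustration, not taken from any project described in this resume:

```python
# Minimal star-schema sketch: one fact table joined to two dimensions,
# then aggregated by dimension attributes. Names are illustrative only.
import sqlite3

con = sqlite3.connect(":memory:")
cur = con.cursor()
cur.executescript("""
CREATE TABLE dim_date (date_key INTEGER PRIMARY KEY, cal_year INTEGER);
CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, product_name TEXT);
CREATE TABLE fact_sales (
    date_key    INTEGER REFERENCES dim_date(date_key),
    product_key INTEGER REFERENCES dim_product(product_key),
    amount      REAL
);
INSERT INTO dim_date VALUES (20240101, 2024), (20240102, 2024);
INSERT INTO dim_product VALUES (1, 'Widget'), (2, 'Gadget');
INSERT INTO fact_sales VALUES (20240101, 1, 100.0), (20240102, 1, 50.0), (20240102, 2, 75.0);
""")

# A typical star query: aggregate the fact, grouped by dimension attributes.
rows = cur.execute("""
    SELECT p.product_name, d.cal_year, SUM(f.amount)
    FROM fact_sales f
    JOIN dim_product p ON p.product_key = f.product_key
    JOIN dim_date d    ON d.date_key    = f.date_key
    GROUP BY p.product_name, d.cal_year
    ORDER BY p.product_name
""").fetchall()
print(rows)  # [('Gadget', 2024, 75.0), ('Widget', 2024, 150.0)]
```

A snowflake schema differs only in that the dimensions themselves are further normalized into sub-dimension tables.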

TECHNICAL SKILLS

Data Modeling Tools: Erwin 9.8/9.6/9.5/9.x/8.x, ER Studio and Oracle Designer.

OLAP Tools: Tableau, SAP BO, SSAS, Business Objects, and Crystal Reports 9.

Oracle: Oracle 12c/11g/10g/9i R2 database servers with RAC, ASM, Data Guard, Grid Control (Oracle Enterprise Manager), Oracle GoldenGate, SQL*Loader and SQL*Plus

ETL Tools: SSIS, DataStage, Informatica Power Center 9.7/9.6/9.5/9.1.

Programming Languages: SQL, T-SQL, HTML, Java Script, CSS, UNIX shells scripting, PL/SQL.

Database Tools: Microsoft SQL Server 2016/2015/2014, Teradata, MS Access, PostgreSQL, Netezza, Oracle.

Web technologies: HTML, DHTML, XML, JavaScript

Reporting Tools: Business Objects, Crystal Reports

Operating Systems: Microsoft Windows 9x/NT/2000/XP/Vista/7 and UNIX.

Tools & Software: TOAD 7.1/6.2, MS Office, BTEQ, Teradata SQL Assistant

Big Data: Hadoop, HDFS, Hive, Pig, HBase, Sqoop, Flume.

Other tools: TOAD, SQL*Plus, SQL*Loader, MS Project, MS Visio and MS

PROFESSIONAL EXPERIENCE

Confidential, Union County, NJ

Sr. Data Architect/Data Modeler

Responsibilities:

  • Provided Data architecture support to enterprise data management efforts, such as the development of the enterprise data model and master and reference data, as well as support to projects, such as the development of physical data models, data warehouses and data marts.
  • Worked on Dimensional and Relational Data Modeling using Star and Snowflake Schemas, OLTP/OLAP system, Fact and Dimension tables, Conceptual, Logical and Physical data modeling using Erwin r9.6.
  • Gathered business requirements, working closely with business users, project leaders and developers.
  • Designed score validations across various risk scores, product development and portfolio profiling; developed strategic risk and trend analyses.
  • Analyzed the business requirements and designed conceptual and logical data models.
  • Led the strategy, architecture and process improvements for data architecture and data management, balancing long- and short-term needs of the business.
  • Built relationships and trust with key stakeholders to support program delivery and adoption of enterprise architecture.
  • Provided technical leadership, mentoring throughout the project life-cycle, developing vision, strategy, architecture and overall design for assigned domain and for solutions.
  • Involved in writing T-SQL queries and optimizing queries in Oracle 12c and Netezza.
  • Created MDM, OLAP data architecture, analytical data marts, and cubes optimized for reporting.
  • Involved in logical modeling using dimensional modeling techniques such as Star Schema and Snowflake Schema.
  • Involved in Normalization and De-Normalization of existing tables for faster query retrieval.
  • Developed Linux shell scripts using NZSQL/NZLOAD utilities to load data from flat files into the Netezza database.
  • Created SSIS Reusable Packages to extract data from Multi formatted Flat files, Excel, XML files into UL Database and Billing Systems.
  • Performed data analysis, statistical analysis, generated reports, listings and graphs using SAS tools, SAS Integration Studio, SAS/Graph, SAS/SQL, SAS/Connect and SAS/Access.
  • Worked on importing and cleansing high-volume data from various sources like Teradata 15, Oracle, flat files and SQL Server.
  • Involved in creating informatica mapping to populate staging tables and data warehouse tables from various sources like flat files, Netezza and oracle sources.
  • Developed Data Mapping, Data profiling, Data Governance and Transformation and cleansing rules for the Master Data Management Architecture involving OLTP, ODS.
  • Analyzed source systems for data acquisition; architected data feathering logic across different source systems.
  • Tested Complex ETL Mappings and Sessions based on business user requirements and business rules to load data from source flat files and RDBMS tables to target tables.
  • Managed definition and execution of data mapping, conversion and reconciliation processes for data originating from a variety of enterprise and SAP systems, leading into the ongoing data governance organization design.
  • Performed data analysis and data profiling using complex SQL queries on various source systems including Oracle and Netezza.

Environment: Erwin r9.6, Oracle 12c, Netezza, PL/SQL, T-SQL, MDM, SQL Server 2014, Informatica Power Center, SQL, Hadoop, Hive, MongoDB, Tableau, Excel, MS Access, SAP etc.

Confidential, St. Louis, MO

Sr. Data Architect/Modeler

Responsibilities:

  • As an Architect, implemented MDM hub to provide clean, consistent data for a SOA implementation.
  • Performed data analysis and data profiling using complex SQL on various source systems including Oracle 10g/11g and Teradata.
  • Developed Conceptual Model and Logical Model using Erwin based on requirements analysis.
  • Created the best fit Physical Data Model based on discussions with DBAs and ETL developers.
  • Participated in the design, development, and support of the corporate operation data store and enterprise Data warehouse database environment.
  • Documented a whole process of working with Tableau Desktop, installing Tableau Server and evaluating Business Requirements.
  • Implemented a dimensional model (logical and physical data modeling) in the existing architecture using Erwin 9.5.
  • Involved in database work using Teradata 14.1, Big Data and NoSQL technologies.
  • Developed, managed and validated existing Data Models, including Logical and Physical Models of the Data Warehouse and source systems, utilizing a 3NF model.
  • Designed Source to Target mapping from primarily Flat files, SQL Server, Oracle 11g.
  • Worked on Normalization and De-Normalization techniques for both OLTP and OLAP systems.
  • Extensively worked in Client-Server application development using Oracle 11g, Oracle Import and Export Utilities.
  • Extensively used SQL Loader to load data from the Legacy systems into Oracle databases using control files and used Oracle External Tables feature to read the data from flat files into Oracle staging tables.
  • Used SSIS to create ETL packages to validate, extract, transform and load data to data warehouse databases, data mart databases, and process SSAS cubes to store data to OLAP databases
  • Created ETL packages using OLTP data sources (SQL Server 2008, Flat files, Excel source files, Oracle) and loaded the data into target tables by performing different kinds of transformations using SSIS.
  • Designed logical data model and physical/conceptual data documents between source systems and the target data warehouse.
  • Used external loaders like MultiLoad, TPump and FastLoad to load data into the Teradata database across analysis, development, testing, implementation and deployment.
  • Developed requirements, perform data collection, cleansing, transformation, and loading to populate facts and dimensions for data warehouse
  • Created, managed, and modified logical and physical data models using a variety of data modeling philosophies and techniques, including Inmon and Kimball.
  • Maintaining data mapping documents, business matrix and other data design artifacts that define technical data specifications and transformation rules
  • Managed the Master Data Governance queue including assessment of downstream impacts to avoid failures
  • Worked in the capacity of ETL Developer (Oracle Data Integrator (ODI) / PL/SQL) to migrate data from different sources into the target Oracle Data Warehouse.
  • Worked with high volume datasets from various sources like SQL Server 2014, Oracle and Text Files.
  • Created named sets, calculated member and designed scope in SSAS, SSIS, SSRS.
  • Worked on Teradata SQL queries, Teradata indexes, and utilities such as MultiLoad, TPump, FastLoad and FastExport.
  • Written SQL scripts to test the mappings and Developed Traceability Matrix of Business Requirements mapped to Test Scripts to ensure any Change Control in requirements leads to test case update.
  • Involved in extensive Data validation by writing several complex SQL queries and Involved in back-end testing and worked with data quality issues.
  • Created SSIS packages to load data from different sources such as Excel, Flat file to SQL server Data warehouse and SQL Server, PL/SQL Transactional database.
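The SQL*Loader work described for this role typically revolves around control files of the following general shape. This is a hedged sketch only: the file name, table name and columns are hypothetical, not taken from the actual engagement:

```
-- Hypothetical SQL*Loader control file: loads a pipe-delimited flat file
-- into a staging table. All names below are illustrative assumptions.
LOAD DATA
INFILE 'customers.dat'
APPEND
INTO TABLE stg_customers
FIELDS TERMINATED BY '|' OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
(
  customer_id   INTEGER EXTERNAL,
  customer_name CHAR,
  created_dt    DATE "YYYY-MM-DD"
)
```

Oracle External Tables, also mentioned above, expose the same kind of flat file as a queryable table instead of pushing rows through a loader session.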

Environment: Erwin 9.5, SSIS, SSRS, SAS, Excel, MDM, PL/SQL, ETL, Tableau, Hadoop, Hive, Pig, MongoDB, Aginity, Teradata SQL Assistant, T-SQL, Cognos, Oracle 11g, SQL etc.

Confidential, Minneapolis, MN

Sr. Data Modeler/Data Analyst

Responsibilities:

  • Provided accurate and flexible capacity modeling for key stakeholders and delivered customized, competitive, and effective data analytics and reporting, with good experience in data mining as well.
  • Planning, data management & analysis, write-up and data presentation.
  • Involved in reviewing business requirements and analyzing data sources from Excel/Oracle/SQL Server for design, development, testing, and production rollover of reporting and analysis projects.
  • Used IBM InfoSphere DataStage to analyze, design, develop, implement and maintain jobs.
  • Coordinated with DB2 on database build and table normalizations and de-normalizations.
  • Developing and enhancing modeling process (requirements, design, implementation, and measurement) of statistical analysis, predictive models, segmentation and other statistical techniques; introduce new methods and tools to drive analytic innovation.
  • Created logical and physical dimensional models for the presentation layer and dim layer for a dimensional data warehouse in Erwin 9.1, and built a logistic model on DBS customer data.
  • Engaged in model development and usage as well as participate in business intelligence consulting around all cycles of consumer credit: prospecting; acquisition; portfolio management; and collections.
  • Improved proof-of-concept tests, quality control, customer segmentation, targeting propensity models (response, attrition models) and the customer lifetime value model.
  • Conducted brainstorming sessions with application developers and DBAs to discuss various denormalization, partitioning and indexing schemes for the Physical Model.
  • Involved in several facets of MDM implementations including Data Profiling, metadata acquisition and data migration.
  • Involved in extensive Data validation by writing several complex SQL queries and Involved in back-end testing and worked with data quality issues.
  • Extensively worked with Netezza database to implement data cleanup, performance tuning techniques.
  • Developing reusable objects like PL/SQL program units and libraries, database procedures and functions, database triggers to be used by the team and satisfying the business rules.
  • Created and configured Workflows, Worklets, and Sessions to transport the data to target Netezza warehouse tables using Informatica Workflow Manager.
  • Wrote R scripts to perform text mining on questionnaires from customers of All state Banking Companies and categorized the documents into positive or negative reviews.
  • Performed data validation on the flat files that were generated in UNIX environment using UNIX commands as necessary.
  • Involved in Troubleshooting and quality control of data transformations and loading during migration from Oracle systems into Netezza EDW.
  • Worked with NZLoad to load flat-file data into Netezza and DB2, and with Architect to identify proper distribution keys for Netezza tables.
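The flat-file validation step described above (checking files in a UNIX environment before loading) can be sketched as follows; the delimiter, expected column count and sample rows are assumptions for illustration only:

```python
# Hedged sketch of pre-load flat-file validation: flag any row whose
# field count does not match the target table's column count.
import csv
import io

EXPECTED_COLS = 3   # assumed width of the target staging table
DELIM = "|"         # assumed field delimiter

# Stand-in for a flat file; the third row is short one field.
sample = io.StringIO("101|Smith|2024-01-01\n102|Jones|2024-01-02\n103|Lee\n")

bad_rows = []
for lineno, row in enumerate(csv.reader(sample, delimiter=DELIM), start=1):
    if len(row) != EXPECTED_COLS:
        bad_rows.append((lineno, row))

print(bad_rows)  # [(3, ['103', 'Lee'])]
```

In practice the same checks are often done with `awk -F'|' 'NF != 3'` on the UNIX side before the loader ever runs.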

Environment: Erwin 9.1, PL/SQL, MDM, SQL Server 2008, Netezza, DB2, DataStage, Informatica, SQL, T-SQL, UNIX, SQL Assistant etc.

Confidential, Pennsylvania

Sr. Data Modeler/ Data Analyst

Responsibilities:

  • Built Analytics team's Data infrastructure, voter contact, digital, and Modeling data and also worked as Data and Geospatial Analyst.
  • Performed Data Modeling, Database Design, and Data Analysis with the extensive use of ER/Studio.
  • Worked with Business users during requirements gathering and business analysis to prepare high level Logical Data Models and Physical Data Models.
  • Performed Reverse Engineering of the current application using ER/Studio and developed Logical and Physical Data Models for Central Model consolidation.
  • Involved in integration of various relational and non-relational sources such as Teradata 13.1, SFDC, SQL Server, COBOL, XML and Flat Files.
  • Involved in Normalization /De-normalization, Normal Form and database design methodology. Expertise in using data modeling tools like MS Visio and ER/Studio Tool for logical and physical design of databases.
  • Worked on Key performance Indicators (KPIs), design of star schema and snowflake schema in Analysis Services.
  • Created mappings using pushdown optimization to achieve good performance in loading data.
  • Documented ER Diagrams, Logical and Physical models, business process diagrams and process flow diagrams.
  • Created reports in Oracle Discoverer by importing PL/SQL functions on the Admin Layer, in order to meet the sophisticated client requests.
  • Extensively used SQL, Transact SQL and PL/SQL to write stored procedures, functions, packages and triggers.
  • Created tables, views, sequences, indexes, constraints and generated SQL scripts for implementing physical data model.
  • Created and fully automated Release Notes that communicate changes in state demographics to Data Directors.
  • Developed data mapping documents between Legacy, Production, and User Interface systems.
  • Developed a star schema for the proposed central model and normalized the star schema to a snowflake schema.
  • Involved in implementing the Land Process of loading the customer Data Set into Informatica Power Center 9.5, MDM from various source systems
  • Worked with mapping parameters, variables and parameter files.
  • Tuning and code optimization using different techniques like dynamic SQL, dynamic cursors, tuning SQL queries, writing generic procedures, functions and packages
  • Created and configured Workflows, Worklets, and Sessions to transport the data to target warehouse tables using Informatica Workflow Manager.
  • Extensively worked on Shell scripts for running SSIS programs in batch mode on UNIX.

Environment: ER/Studio, Teradata 13.1/14.1, SSIS, SAS, Excel, T-SQL, Tableau, Cognos, Pivot tables, Graphs, MDM, PL/SQL, Oracle, SQL, Informatica Power Center etc.

Confidential

Data Analyst

Responsibilities:

  • Served as Report Analyst for medical databases, creating reports, and as Data Analyst performing QA/QC on data extracted from large databases.
  • Involved in creating Logical and Physical data models with Star and Snowflake schema techniques using Erwin 8.x in Data warehouse as well as in Data Mart.
  • Performed data analysis and data profiling using complex SQL on various source systems including Oracle 8.x and Netezza.
  • Involved in Data Analysis, Data Validation, Data Cleansing, Data Verification and identifying data mismatch.
  • Wrote and executed customized SQL code for ad hoc reporting duties and used other tools for routine reporting.
  • Extensively used SQL to develop reports from the existing relational data warehouse tables (queries, joins, filters, etc.)
  • Created Mappings for Initial load from MS SQL server 2005 to Netezza while performing data cleansing.
  • Converted SQL Server packages to Informatica Power Center mapping to be used with Netezza.
  • Involved in Informatica MDM processes including batch based and real-time processing.
  • Performed ad-hoc distributed queries using SQL, PL/SQL, MS Access, MS Excel and UNIX to meet business analysts' needs.
  • Responsible for reviewing data model, database physical design and Presentation layer design.
  • Conducted meetings with business and development teams for data validation and end-to-end data mapping.
  • Developed logging for ETL load at the package level and task level to log number of records processed by each package and each task in a package using Informatica.
  • Involved in Oracle, SQL, PL/SQL, T-SQL queries programming and creating objects such as stored procedures, packages, functions, triggers, tables and views.
  • Performed match/merge and ran match rules to check the effectiveness of MDM process on data.
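The MDM match/merge step above can be illustrated with a toy sketch: records that agree on a normalized match key survive as one "golden" record. The match rules, field names and sample data here are assumptions, far simpler than a real Informatica MDM rule set:

```python
# Toy MDM-style match/merge pass. Records agreeing on a normalized
# (name, zip) key are merged; the first record wins. Names are illustrative.
records = [
    {"id": 1, "name": "ACME Corp.", "zip": "07083"},
    {"id": 2, "name": "acme corp",  "zip": "07083"},
    {"id": 3, "name": "Beta LLC",   "zip": "63101"},
]

def match_key(rec):
    # Normalize the name: lowercase, drop punctuation and whitespace.
    name = "".join(ch for ch in rec["name"].lower() if ch.isalnum())
    return (name, rec["zip"])

golden = {}
for rec in records:
    golden.setdefault(match_key(rec), rec)  # first record wins the merge

print(sorted(r["id"] for r in golden.values()))  # [1, 3]
```

Checking "effectiveness" of such rules, as described above, amounts to reviewing which records collapsed together and which survived distinct.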

Environment: Erwin 8.x, MDM, ETL, MS SQL Server 2005, PL/SQL, Netezza, Oracle, IBM etc.
