
Sr. Data Modeler/data Analyst Resume


Dallas, TX

SUMMARY

  • Over 8 years of experience in data analysis and in developing conceptual, logical, and physical database designs for Online Transaction Processing (OLTP), Online Analytical Processing (OLAP), and MDM.
  • Experienced working with data modeling tools like Erwin, Power Designer and ER Studio.
  • Experienced in designing star schema and snowflake schema for Data Warehouse and ODS architectures.
  • Excellent experience in Requirement gathering, System analysis, handling business and technical issues & communicating with both business and technical users.
  • Experienced in design and development of ETL processes from databases such as Teradata, DB2, Netezza, Oracle, and Flat files sources.
  • Experienced in designing the Data Marts in dimensional data modeling using star and snowflake schemas.
  • Hands - on experience in system architecture, data management, integration, governance, data quality, data warehousing, relational database management systems (RDBMS), data modeling and extract, transform and load (ETL).
  • Expertise in T-SQL (DML/DDL) for creating Tables, Stored Procedures, Views, Indexes, Cursors, Triggers, User Profiles, User Defined Functions(UDF), Relational Database Models and Data Integrity in observing Business Rules.
  • Experienced in using Aginity Workbench for Netezza to perform DML, DDL, and other operations on Netezza databases.
  • Proficient with data analysis, mapping source and target systems for data migration efforts, and resolving issues relating to data migration.
  • Experienced in Data Scrubbing/Cleansing, Data Quality, Data Mapping, Data Profiling, and Data Validation in ETL.
  • Excellent proficiency in writing SQL (including ORACLE and Teradata).
  • Experienced in creating and documenting Metadata for OLTP and OLAP when designing systems.
  • Excellent knowledge of Ralph Kimball and Bill Inmon's approaches to Data Warehousing.
  • Extensive experience in development of T-SQL, DTS, OLAP, PL/SQL, Stored Procedures, Triggers, Functions, Packages, performance tuning and optimization for business logic implementation.
  • Proficient in System Analysis, ER/Dimensional Data Modeling, Database design and implementing RDBMS specific features.
  • Well versed in conducting Joint Application Design (JAD) and Joint Requirement Planning (JRP) sessions, User Acceptance Testing (UAT), Gap analysis, Cost benefit analysis and ROI analysis.
  • Efficient in analyzing and documenting business requirement documents (BRD) and functional requirement documents (FRD) along with Use Case Modeling and UML.
  • Experienced in using BTEQ and SQL Assistant front-end tools to issue SQL commands matching the business requirements to Teradata RDBMS.
  • Experienced in developing Entity-Relationship diagrams and modeling Transactional Databases and Data Warehouses using tools like ERWIN, ER/Studio and Power Designer.
  • Experienced with modeling using ERWIN in both forward and reverse engineering cases.
  • Good in Data warehouse loads, determining hierarchies, building various logics to handle Slowly Changing Dimensions.
  • Excellent Team player to work in conjunction with Business analysts, Production Support teams, Subject Matter Experts, Database Administrators and Database developers.
  • Exceptional problem-solving and sound decision-making capabilities, recognized for identifying alternative solutions.

TECHNICAL SKILLS

Analysis and Modeling Tools: Erwin 7.2/7.0, 8.0, r9, Sybase Power Designer, ER/Studio.

ETL Tools: Informatica Power Center, SSIS

OLAP Tools: MS SQL Analysis Manager, DB2 OLAP, Cognos PowerPlay

Languages: SQL, PL/SQL, T-SQL, XML, HTML, UNIX Shell Scripting.

Databases: MS SQL Server 2014, DB2, Teradata 12/13/14.x/15, Netezza.

Operating Systems: Windows, UNIX (Sun Solaris 10)

Project Execution Methodologies: Ralph Kimball and Bill Inmon data warehousing methodology, Rational Unified Process (RUP), Rapid Application Development (RAD), Joint Application Development (JAD), Agile and Scaled Agile Framework

Tools & Software: TOAD, MS Office suite (Word, Excel, MS Project, Outlook), BTEQ, Teradata SQL Assistant, Aginity Workbench for Netezza, VSS

Programming Languages: SQL, T-SQL, Base SAS and SAS/SQL, HTML, XML

PROFESSIONAL EXPERIENCE

Confidential, Dallas TX

Sr. Data Modeler/Data Analyst

Responsibilities:

  • Created and maintained logical and physical models for the data mart. Created partitions and indexes for the tables in the data mart.
  • Performed data profiling and analysis, applied various data cleansing rules, designed data standards and architecture, and designed the relational models.
  • Maintained metadata (data definitions of table structures) and version control for the data model.
  • Developed SQL scripts for creating tables, sequences, triggers, views, and materialized views.
  • Established processes for governing the identification, collection, and use of metadata; took steps to assure metadata accuracy and validity; and assessed and determined governance, stewardship, and frameworks for managing data across the organization.
  • Conducted performance analysis and created partitions, indexes and Aggregate tables.
  • Responsible for extracting, transforming and loading data from flat files using tools like BTEQ, FastLoad and MultiLoad, and loading it into the Teradata Data Warehouse.
  • Utilized Erwin's forward/reverse engineering tools and target database schema conversion process.
  • Transformed data quality from an application-level to an enterprise-level practice.
  • Developed new data governance business processes/procedures within an Agile project management (PLMP) environment.
  • Provided training, testing and user support for the new data governance initiatives.
  • Led design of high-level conceptual and logical models that facilitate a cross-system/cross-functional view of data requirements.
  • Wrote several Netezza SQL scripts to load data into Netezza tables.
  • Oversaw mapping of data sources, data movement, interfaces, and analytics, with the goal of ensuring data quality.
  • Established methods and procedures for tracking data quality, completeness, redundancy, and improvement.
  • Tuned Informatica mappings for optimum performance and scheduled ETL sessions.
  • Performed data cleansing and transformation using Teradata Utilities.
  • Data deliverables included requirements, data analysis (including profiling), modeling, data flow diagrams, data sourcing plans, and database architecture.
  • Worked on optimizing and tuning Netezza SQL to improve batch performance.
  • Developed SQL scripts for loading the aggregate tables and rollup dimensions and performed unit testing, system integrated testing for the aggregate tables.
  • Performed data analysis on the target tables to make sure the data met business expectations.
  • Developed SQL scripts for loading data from the staging area to target tables, and proposed the EDW data design to centralize data scattered across multiple datasets.
  • Extensively involved in creating procedure, packages, functions and triggers in PLSQL.
  • Tuned Teradata SQL statements using EXPLAIN: analyzed data distribution among AMPs and index usage, collected statistics, defined indexes, revised correlated subqueries, used hash functions, etc.
  • Provided feedback to business and management relative to issues and risks surrounding data governance initiatives.
  • Performed data mapping and data design (data modeling) to integrate the data across multiple databases into the EDW.
  • Migrated the critical reports using PL/SQL and UNIX packages.
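The EXPLAIN-driven Teradata tuning workflow described in these bullets typically follows a pattern like the sketch below. All table, column, and index names here are hypothetical illustrations, not objects from the project:

```sql
-- Hypothetical sketch of the Teradata tuning cycle described above.
-- 1. Inspect the optimizer's plan (AMP steps, join strategy, spool usage)
--    before changing anything.
EXPLAIN
SELECT c.cust_id, SUM(t.txn_amt)
FROM   cust c
JOIN   txn  t ON c.cust_id = t.cust_id
GROUP  BY c.cust_id;

-- 2. Refresh statistics so the optimizer sees current data demographics
--    on the join/distribution column.
COLLECT STATISTICS ON txn COLUMN (cust_id);

-- 3. Add a secondary index if the plan shows full-table scans on a
--    frequently filtered column.
CREATE INDEX txn_cust_idx (cust_id) ON txn;
```

Re-running EXPLAIN after each change confirms whether the plan improved (fewer all-rows scans, better AMP distribution) before the script is promoted.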

Environment: Oracle 12c, SQL Plus, Erwin r9.6, Oracle Exadata, Teradata SQL Assistant, MS Visio, Windows XP, QC Explorer, Business Objects, Oracle SQL, Aginity Netezza, QuickData, Informatica, UNIX, Metadata.

Confidential, Overland Park, KS

Sr. Data Modeler/Data Analyst

Responsibilities:

  • Worked with SMEs and other stakeholders to determine requirements and to identify entities and attributes for building conceptual, logical and physical data models.
  • Created a logical design and physical design in Erwin.
  • Enforced referential integrity in the OLTP data model for consistent relationships between tables and efficient database design.
  • Worked extensively with Teradata SQL Assistant to interface with Teradata.
  • Conducted data analysis using SQL and Excel as part of requirements analysis and to assist data quality teams.
  • Gathered user reporting and analytical requirements by working with business and reporting teams.
  • Optimized high-volume tables (including collection tables) in Teradata using various join index techniques, secondary indexes, join strategies and hash distribution methods.
  • Analyzed source system data and existing data models, and profiled data using SQL.
  • Created star schema models (using Erwin) with data flow diagrams, well-documented metadata definitions and source-to-target data mappings.
  • Created mappings for Type 2 Slowly Changing Dimensions, updating and inserting records to maintain history.
  • Converted Oracle stored procedures to their Netezza equivalents.
  • Developed Data Mapping, Data Governance, and transformation and cleansing rules for the Master Data Management Architecture involving OLTP, ODS.
  • Generated ad-hoc reports using Crystal Reports.
  • Involved in data modeling and providing technical solutions related to Teradata to the team.
  • Designed the physical model for implementing the model in the Teradata physical database.
  • Wrote Pre and post session Shell scripts for extracting data from files, remove duplicates and sorting in the database to optimize performance.
  • Extensively worked on Informatica to extract data from flat files and Oracle, and to load the data into the target database.
  • Used Erwin to create logical and physicaldatamodels for enterprise wide OLAP system.
  • Generated comprehensive analytical reports by running SQL queries against current databases to conductdataanalysis.
  • Wrote several Teradata BTEQ scripts to implement the business logic.
  • Actively partnered with related SDLC, Operational Risk Management and Enterprise Risk Management partners to improve the data governance process embedded in these areas.
  • Extensively used MS Visio to represent existing and proposed data flow diagrams.
  • Participated in JAD sessions to solve the revolving issues between the executive teams and developers.
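A Type 2 Slowly Changing Dimension load of the kind described above can be sketched in SQL as a two-step update-then-insert. The dimension and staging table names below are hypothetical, used only to illustrate the pattern:

```sql
-- Hypothetical Type 2 SCD maintenance for a customer dimension.
-- Step 1: close out the current version of any row whose tracked
-- attribute changed in the staging feed.
UPDATE dim_customer d
SET    end_dt = CURRENT_DATE,
       current_flag = 'N'
WHERE  d.current_flag = 'Y'
AND EXISTS (SELECT 1 FROM stg_customer s
            WHERE s.cust_id = d.cust_id
            AND   s.address <> d.address);

-- Step 2: insert the new version with an open-ended effective range,
-- so history is preserved rather than overwritten.
INSERT INTO dim_customer (cust_id, address, start_dt, end_dt, current_flag)
SELECT s.cust_id, s.address, CURRENT_DATE, DATE '9999-12-31', 'Y'
FROM   stg_customer s
WHERE NOT EXISTS (SELECT 1 FROM dim_customer d
                  WHERE d.cust_id = s.cust_id
                  AND   d.current_flag = 'Y'
                  AND   d.address = s.address);
```

The `start_dt`/`end_dt`/`current_flag` columns are the standard Kimball mechanism for letting queries pick either the current row or the version effective on a given date.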

Environment: Erwin 9.5, MS Visio, Oracle 11g, Oracle Designer, SQL Server 2012, Oracle SQL Developer 2008, DATAFLUX 6.1, PL/SQL Developer, Informatica PowerCenter, SQL, SQL Navigator, Crystal Reports 9, Netezza, Teradata SQL Assistant.

Confidential, Cleveland Ohio

Sr. Data Modeler/Data Analyst

Responsibilities:

  • Used Erwin for effective model management of sharing, dividing and reusing model information and design for productivity improvement.
  • Involved in preparing Logical Data Models and Physical Data Models.
  • Worked extensively on the Enterprise Data Warehouse, in both forward engineering and reverse engineering, using data modeling tools.
  • Involved in the creation and maintenance of the Data Warehouse and repositories containing metadata.
  • Resolved data type inconsistencies between the source systems and the target system using the mapping documents and by analyzing the database with SQL queries.
  • Heavily involved in writing complex SQL queries to pull the required information from the database using Teradata SQL Assistant.
  • Extensively used both star schema and snowflake schema methodologies in building and designing the logical data model in both Type 1 and Type 2 dimensional models.
  • Worked with the DBA group to create a best-fit physical data model from the logical data model using forward engineering.
  • Used Normalization methods up to 3NF and De-normalization techniques for effective performance in OLTP systems.
  • Loaded data from various sources such as Oracle, SQL Server and flat files into a single Data Warehouse.
  • Developed data migration and cleansing rules for the integration architecture (OLTP, ODS, DW).
  • Used Teradata SQL Assistant, Teradata Administrator, PMON and data load/export utilities like BTEQ, FastLoad, MultiLoad, FastExport and TPump in UNIX/Windows environments, and ran the batch process for Teradata.
  • Documented logical, physical, relational and dimensional data models. Designed the data marts in dimensional data modeling using star and snowflake schemas.
  • Wrote several UNIX Korn shell scripts for file transfers, error logging, data archiving, checking log files, and cleanup.
  • Worked on data modeling and produced data mapping and data definition specification documentation.
  • Worked on database connections, SQL joins, aliases, views and aggregate conditions, and wrote various PL/SQL procedures, functions and triggers for processing business logic in the database.
  • Generated reports using Teradata BTEQ.
  • Wrote procedures and functions using dynamic SQL, and wrote complex SQL queries using joins, subqueries and correlated subqueries.
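A procedure built on dynamic SQL, as mentioned above, usually assembles the statement as a string and executes it with bind variables. The sketch below is a hypothetical Oracle PL/SQL example (the procedure and table names are illustrative, not from the project):

```sql
-- Hypothetical PL/SQL procedure using dynamic SQL to purge old rows
-- from a staging table whose name is supplied at run time.
CREATE OR REPLACE PROCEDURE purge_stage (
  p_table IN VARCHAR2,
  p_days  IN NUMBER
) AS
BEGIN
  -- DBMS_ASSERT validates the identifier, preventing SQL injection
  -- through the table-name parameter; the retention cutoff is bound.
  EXECUTE IMMEDIATE
    'DELETE FROM ' || DBMS_ASSERT.SIMPLE_SQL_NAME(p_table) ||
    ' WHERE load_dt < TRUNC(SYSDATE) - :days'
    USING p_days;
  COMMIT;
END purge_stage;
/
```

Binding `p_days` with `USING` (rather than concatenating it) lets Oracle reuse the parsed statement across calls with different retention windows.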

Environment: Erwin, Oracle 11g, SQL Server 2008, MS Excel, MS Visio, Informatica, Rational Rose, Requisite Pro, Teradata, Netezza, Windows 7, PL/SQL, MS Office, MS Access.

Confidential

Data Modeler/Data Analyst

Responsibilities:

  • Designed logical and physical data models for multiple OLTP and analytic applications.
  • Created business data constraints, indexes, sequences, etc. as needed in the physical data model.
  • Extensively used the Erwin design tool and Erwin Model Manager to create and maintain the Data Mart.
  • Extensively used star schema methodologies in building and designing the logical data model into dimensional models.
  • Created reusable transformations and mapplets based on the business rules to ease development.
  • Created the Physical Data Model from the Logical Data Model through forward engineering in Erwin.
  • Used MicroStrategy to generate reports.
  • Created stored procedures using PL/SQL and tuned the databases and backend process.
  • Involved in data analysis, primarily identifying data sets, source data, source metadata, data definitions and data formats.
  • Tested the ETL process both before and after data validation. Tested the messages published by the ETL tool and the data loaded into various databases.
  • Created UNIX scripts for file transfer and file manipulation.
  • Wrote simple and advanced SQL queries and scripts to create standard and ad hoc reports for senior managers.
  • Used an expert-level understanding of different databases in combination for data extraction and loading, joining data extracted from different databases and loading it into a specific database.
  • Designed and developed PL/SQL procedures, functions and packages to create summary tables.
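A summary-table procedure of the kind mentioned above typically aggregates a detail table into a reporting table on a schedule. This is a minimal hypothetical sketch (table and column names are illustrative only):

```sql
-- Hypothetical PL/SQL procedure that rebuilds a monthly sales summary
-- from a detail table; a full delete-and-reload keeps the sketch simple.
CREATE OR REPLACE PROCEDURE refresh_sales_summary AS
BEGIN
  DELETE FROM sales_summary;

  INSERT INTO sales_summary (region, sales_month, total_amt)
  SELECT region,
         TRUNC(sale_dt, 'MM'),   -- first day of the month as the bucket
         SUM(amount)
  FROM   sales_detail
  GROUP  BY region, TRUNC(sale_dt, 'MM');

  COMMIT;
END refresh_sales_summary;
/
```

For large volumes an incremental merge or a materialized view would replace the full reload, but the aggregate-into-summary shape stays the same.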

Environment: SQL Server 2005, UML, Business Objects 5, Teradata, Windows XP, SSIS, SSRS, Erwin, DB2, Informatica, ClearCase, UNIX and shell scripting.

Confidential

Data Analyst

Responsibilities:

  • Performed numerous data pull requests using SQL for analysis.
  • Enforced referential integrity in the OLTP data model for consistent relationships between tables and efficient database design.
  • Imported/exported large amounts of data from files to Teradata and vice versa.
  • Developed data mapping, data governance, transformation and cleansing rules for the Master Data Management architecture involving OLTP and ODS.
  • Generated graphs using MS Excel Pivot tables, PowerPoint deck for senior management review.
  • Identified and tracked the slowly changing dimensions, heterogeneous sources and determined the hierarchies in dimensions.
  • Created and maintained Teradata databases, users, tables, views, stored procedures and macros.
  • Designed and developed Ad-hoc reports as per business analyst, operation analyst and project manager data requests.
  • Utilized ODBC connectivity to Teradata and MS Excel to automate reports and the graphical representation of data for business and operational analysts.
  • Extracted data from existing data sources and developed and executed departmental reports for performance and response purposes using Oracle SQL and MS Excel.
  • Extracted data from existing data sources and performed ad-hoc queries.

Environment: UNIX scripting, Oracle SQL Developer, Teradata, Windows XP, SAS data sets
