
SAS Data Analyst/Data Modeler Resume


Dallas, TX

SUMMARY

  • 10+ years of extensive experience across the complete Software Development Life Cycle (SDLC), covering Requirements Management, Data Modeling, Data Analysis, Data Mapping, System Analysis, Architecture and Design, Development, Testing and Deployment of business applications, as well as business analysis.
  • Strong Data Mapping experience using ER diagrams, Dimensional/Hierarchical data modeling, Star Schema modeling and Snowflake modeling with tools like Erwin, E/R Studio and Sybase Power Designer.
  • Experienced in creating DDL scripts for implementing data modeling changes.
  • Experienced in creating Erwin reports in HTML and RTF formats depending upon the requirement; published data models in Model Mart, created naming convention files, and coordinated with DBAs to apply data model changes.
  • Proficient in various SAS procedures for data manipulation, such as PROC SUMMARY, PROC REPORT, DATA _NULL_, PROC TABULATE, PROC FREQ, PROC UNIVARIATE, PROC MEANS, PROC SQL, MERGE, PROC SORT, and SAS informats/formats (see the SAS sketch after this list).
  • Expertise in using data modeling Tools like ERWIN, Power Designer and E/R Studio.
  • Excellent experience normalizing tables and dimensions up to 3NF to optimize performance.
  • Proficient in Hadoop, Hive, MapReduce, Pig and NOSQL databases like MongoDB, HBase, Cassandra.
  • Experienced in Data Integration techniques like Data Extraction, Transformation and Loading (ETL) from disparate source databases like Oracle, SQL Server, MS Access, flat files, CSV files and XML files into target warehouses using various transformations in Informatica and SSIS.
  • Expertise in producing graphs in SAS by utilizing procedures in SAS/GRAPH like PROC GPLOT and PROC GCHART.
  • Expertise in Physical Modeling for multiple platforms such as Oracle/Teradata/SQL Server/DB2/Netezza.
  • Expertise in Source to Target data mapping, Standardization Documents, Slowly Changing Dimension Mapping Creation, Star/Snowflake Schema Mapping Creation, RDBMS, Building Data Marts and Metadata Management.
  • Experienced using Teradata SQL Assistant and data load/export utilities like BTEQ, FastLoad (FLoad), MultiLoad (MLoad) and FastExport, with exposure to TPump on UNIX/Windows environments.
  • Experienced in creating the Logical and Physical design of the Data Warehouse (both Fact and Dimension tables) using Star Schema and Snowflake Schema approaches.
  • Proficient in Normalization (1NF/2NF/3NF) /De-normalization techniques in relational/dimensional database environments.
  • Experienced with Teradata SQL and Teradata Utilities, and extensively worked on Teradata data modeling projects.
  • Solid understanding of Rational Unified Process (RUP) using Rational Rose, Requisite Pro, Unified Modeling Language (UML), Object Modeling Technique (OMT), Extreme Programming (XP), and Object Oriented Analysis (OOA).
  • Extensive expertise in developing and maintaining programs using Base SAS, SAS/STAT, SAS/MACRO, SAS/ACCESS, SAS Visual Analytics, etc.
  • Strong experience in conducting User Acceptance Testing (UAT) and documentation of Test Cases. Expertise in designing and developing Test Plans and Test Scripts.
  • Expertise in relational database concepts, dimensional database concepts, and database architecture & design for Business Intelligence and OLAP.
  • Good Experience in database design using PL/SQL, SQL, T-SQL to write Stored Procedures, Functions, Triggers, Views.
  • Experienced in development methodologies like RUP, SDLC, AGILE, SCRUM and Waterfall.
  • Well versed in the software development life cycle (SDLC). Experienced in dealing with different data sources ranging from flat files, Oracle, Sybase, SQL Server, Teradata and MS Access to Excel.
  • Experienced in creating reports using Crystal Reports XI.
  • Extensive experience in loading high volume data, worked extensively with data migration, data cleansing and ETL processes.
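
A minimal SAS sketch illustrating the data-manipulation procedures listed above; the dataset and variable names (work.claims, region, amount) are hypothetical placeholders, not taken from any engagement described here.

    /* Hypothetical input: work.claims with variables region and amount */
    proc means data=work.claims n mean sum maxdec=2;   /* summary statistics by group */
        class region;
        var amount;
    run;

    proc freq data=work.claims;                        /* frequency distribution */
        tables region / nocum;
    run;

    proc sql;                                          /* equivalent aggregation in PROC SQL */
        create table work.region_totals as
        select region,
               count(*)    as n_claims,
               sum(amount) as total_amount format=dollar12.2
        from work.claims
        group by region;
    quit;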

TECHNICAL SKILLS

Data Modeling Tools: Erwin r9.6/9.5/9.1/8.x/7.x, E/R Studio, Sybase Power Designer, MS Visio

Data Analysis: SAS, SQL, R

Databases: Oracle 12c/11g/10g, IBM DB2, Teradata R15/R14/R13, MS SQL Server, MS Access, Netezza

Languages: SQL, PL/SQL, T-SQL, HTML, Java, Visual Basic, R, UNIX Shell Scripting, Perl.

Operating Systems: Windows NT/XP/2000/7, UNIX, Linux, Sun Solaris

Other Tools: MS Office, SharePoint, Lotus Notes, Mega, Aginity, Teradata SQL Assistant

ETL Tools: Informatica Power Centre, SSIS

BI Tools: Tableau, Tableau Server, Tableau Reader, SAP Business Objects, OBIEE, QlikView, SAP Business Intelligence, Amazon Redshift, Azure SQL Data Warehouse.

Big Data Technologies: Hadoop, Hive, HDFS, MapReduce, Pig, HBase, Cassandra, MongoDB.

PROFESSIONAL EXPERIENCE

Confidential, Burlington NJ

Sr. Data Modeler/Data Analyst

Responsibilities:

  • Interacted with Business Analysts, SMEs and other Data Architects to understand business needs and functionality for various project solutions.
  • Analyzed business, data and systems requirements in order to perform data modeling and design (high-level conceptual business/data models, entity-relationship and dimensional logical data models, and detailed physical database design models).
  • Built Conceptual, Logical and Physical Data Models for OLTP systems and published the LDM, PDM and Data Dictionary at the end of each sprint.
  • Responsible for the consistency of data design across projects by adhering to standards, following roadmaps and implementing strategic initiatives.
  • Designed the logical data model consisting of entities, attributes and relationships between entities, and documented the model in CA Erwin.
  • Handled importing data from various data sources, performed transformations using Hive and MapReduce, and loaded data into HDFS.
  • Managed all indexing, debugging and query optimization techniques for performance tuning using T-SQL.
  • Developed complex Teradata SQL code in BTEQ scripts using OLAP and aggregate functions, to name a few.
  • Worked in an Agile data modeling methodology, creating data models in sprints within an SOA architecture.
  • Involved in data governance processes; authored data transformation scripts, performed data cleansing actions, and executed data integration/consolidation processes.
  • Worked with different sources such as Oracle, Teradata, SQL Server 2012, Excel, flat files, complex flat files, Cassandra, MongoDB, HBase and COBOL files.
  • Created Conceptual, Logical and Physical designs in Erwin and used Erwin to develop data models using Star and Snowflake Schema methodologies.
  • Conducted data model reviews with developers, architects, business analysts and subject matter experts to collaborate and gain consensus.
  • Used E.F. Codd's Normalization (1NF, 2NF & 3NF) and Denormalization techniques for effective performance in OLTP and OLAP systems.
  • Designed and developed various process-automation applications on the database server using Oracle products/tools, UNIX shell scripts, PL/SQL packages, procedures and database triggers based on client/server and multi-tier technologies.
  • Wrote complex SQL queries to pull the required information for business use from the database using Teradata SQL Assistant.
  • Documented the source-to-target mapping spreadsheet, which holds all the information on source and target data types and all the necessary transformation rules, which are in turn used for metadata updates.
  • Extensively worked on Teradata tools & utilities like Fast Load, Multi Load, TPump, Fast Export, Teradata Parallel Transporter (TPT) and BTEQ.
  • Created source to target mapping specifications using Informatica data quality tool.
  • Mapped data entities and attributes from source to target data stores and created data mapping documentation.
  • Executed SQL queries to retrieve data from databases for analysis and created Netezza SQL scripts to verify that tables loaded correctly.
  • Maintained large data sets, combining data from various sources in varying formats into SAS data sets using SET and MERGE to generate reports and graphs, and extensively used PROC SQL and macros to extract data and build reports (see the SET/MERGE sketch after this list).
  • Populated or refreshed Teradata tables using FastLoad (FLoad), MultiLoad (MLoad), TD load utilities.
  • Developed and performed standard queries to ensure data quality, identify data inconsistencies and missing data, and resolve issues as needed.
  • Created partitioned and bucketed tables in Hive; involved in creating Hive internal and external tables, loading them with data, and writing Hive queries involving multiple join scenarios.
  • Created tables, indexes and constraints, wrote T-SQL statements for data retrieval, and was involved in performance tuning of T-SQL queries and stored procedures.
  • Involved in designing and architecting data warehouses and data lakes on regular (Oracle, SQL Server), high-performance (Netezza and Teradata) and big data (Hadoop - MongoDB, Hive, Cassandra and HBase) databases.
  • Cleaned up the data models in Erwin, updating metadata information where necessary, and worked on importing and cleansing high-volume data from various sources like Teradata, Oracle, flat files and SQL Server 2012.
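
A minimal sketch of the SET/MERGE pattern referenced above for combining sources into SAS data sets; all dataset and variable names (work.members, work.claims_jan, member_id) are hypothetical.

    /* Stack monthly extracts with SET */
    data work.all_claims;
        set work.claims_jan work.claims_feb work.claims_mar;
    run;

    /* MERGE requires both inputs sorted by the BY variable */
    proc sort data=work.members;    by member_id; run;
    proc sort data=work.all_claims; by member_id; run;

    data work.claims_enriched;
        merge work.members    (in=in_m)
              work.all_claims (in=in_c);
        by member_id;
        if in_m and in_c;   /* keep only members that have claims */
    run;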

Environment: Erwin r9.6, Informatica 9.5, Oracle 12c, Teradata 15, IBM DB2, Business Objects, SQL Server 2008/2012, SQL, PL/SQL, MS Excel (VBA), Netezza Aginity, Teradata SQL Assistant, Metadata, UNIX, SSIS, SSRS, SAS, SAS Enterprise Guide 4.2/5.1, MS Access and Macros, Hadoop, HiveQL, MongoDB, Cassandra, Pig.

Confidential, MN

Sr. Data Modeler/ Data Analyst

Responsibilities:

  • Gathered and identified business data requirements from business partners and development teams, understood the information needs, and translated those data requirements into conceptual, logical and physical database models.
  • Created Conceptual, Logical and Physical data models as per enterprise standard and designed the Logical Model into Dimensional Model using Star Schema and Snowflake Schema on Erwin to build data warehouse.
  • Developed Data Mapping, Data Governance, Transformation and Cleansing rules for the Master Data Management Architecture involving OLTP, ODS and OLAP.
  • Wrote, tested and implemented Teradata FastLoad, MultiLoad and BTEQ scripts, DML and DDL.
  • Involved in customized reports using the SAS/MACRO facility, PROC REPORT and PROC TABULATE.
  • Interpreted stakeholder functional and information needs and created high level business information models and/or conceptual data models understandable by the business and application owners.
  • Reverse engineered the existing data marts and identified the Data Elements (in the source systems), Dimensions, Facts and Measures required for reports.
  • Cleansed unnecessary tables and columns and redefined various attributes and relationships in the Reverse Engineering Model.
  • Provided centralized direction for data catalogs, metadata repositories, data definitions, and data relationships.
  • Created data models and analytical systems for OLAP and assisted in the data extraction, transformation and loading (ETL) process.
  • Worked on BTEQ scripting, building complex SQL to map the data per the requirements; involved in data governance activities such as setting standards for data definitions and class words.
  • Created SAS datasets from the Oracle database using random sampling techniques and created Oracle tables from SAS datasets using SAS macros (see the sampling sketch after this list).
  • Created and maintained the Logical Data Model (LDM) for the project, including documentation of all entities, attributes, data relationships, primary and foreign key structures, allowed values, codes, business rules, glossary terms, etc.
  • Responsible for identifying and fixing bottlenecks through performance tuning in the Netezza database.
  • Mapped the business requirements and new databases to the logical data model, which defines the project delivery needs, and generated DDL statements and scripts from the Physical Data Model to create objects like tables, views, indexes, stored procedures and packages.
  • Conducted database performance tuning (database objects, SQL, T-SQL and PL/SQL) including Normalization/De-normalization, Indexes, Table Partitioning, Parallel Processing, Caching and Data Compression.
  • Designed the Data Warehouse and MDM hub Conceptual, Logical and Physical data models.
  • Built transformations, including aggregation and summation constructs, from source to data warehouse; adjusted and maintained SQL scripts and performed further data analysis and data aggregation.
  • Generated comprehensive analytical reports by running SQL queries against current databases to conduct data analysis.
  • Profiled data to determine primary, secondary and unique indexes, and maintained and enforced data architecture/administration standards, including standardization of column name abbreviations, domains and attributes.
  • Worked on SQL Server concepts SSIS (SQL Server Integration Services), SSAS (Analysis Services) and SSRS (Reporting Services).
  • Created and maintained metadata for the enterprise, such as the data dictionary and data definitions.
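
A minimal sketch of the Oracle-to-SAS random-sampling flow mentioned above, assuming SAS/ACCESS to Oracle; the connection details, table names and sampling rate are placeholders.

    /* Hypothetical SAS/ACCESS connection to Oracle */
    libname ora oracle user=dw_user password=XXXXXX path=orcl schema=dw;

    /* Simple random sample of roughly 10% of the source rows */
    proc surveyselect data=ora.customer_txn
                      out=work.txn_sample
                      method=srs samprate=0.10 seed=12345;
    run;

    /* Write a SAS dataset back to Oracle as a table */
    data ora.txn_sample_copy;
        set work.txn_sample;
    run;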

Environment: Erwin r9.5, Oracle 11g, Teradata 14, DB2, SSIS, Business Objects, SQL Server 2005/2008, MS Excel, Teradata SQL Assistant, Aginity, UNIX, Informatica, SAS 9.3, SAS/Macros, SAS/SQL, SAS/ACCESS, SAS/STAT, Hadoop, HiveQL, Tableau, Netezza, SQL, T-SQL.

Confidential, Cincinnati OH

Sr. Data Modeler/ Data Analyst

Responsibilities:

  • Conducted data modeling exercises in support of subject areas and/or specific client needs for data, reports or analyses, with a concern toward reuse of existing data elements and alignment with existing data assets and the target enterprise data architecture.
  • Developed conceptual, logical, and physical Enterprise Data Model based on industry standards.
  • Involved in preparing logical data models and conducted controlled brainstorming sessions with project focus groups.
  • Used Model Mart of ER/Studio for effective model management of sharing, dividing and reusing model information and design for productivity improvement.
  • Involved in designing Context Flow Diagrams, Structure Charts and ER diagrams, and worked on database features and objects such as partitioning, change data capture, indexes, views and indexed views to develop an optimal physical data model.
  • Developed processes on both Teradata and Oracle using shell scripting and RDBMS utilities such as MultiLoad, FastLoad, FastExport and BTEQ (Teradata), and SQL*Plus and SQL*Loader (Oracle).
  • Used Teradata SQL Assistant, Teradata Administrator, PMON and data load/export utilities like BTEQ, FastLoad, MultiLoad, FastExport and TPump on UNIX/Windows environments, and ran the batch process for Teradata.
  • Translated the business requirements into workable functional and non-functional requirements at a detailed production level using Workflow Diagrams, Sequence Diagrams, Activity Diagrams and Use Case Modeling.
  • Performed statistical data analysis and generated reports and graphs using SAS/Base, SAS/Macros, SAS/SQL, SAS/Access and SAS/Graph. Generated reports using Microsoft Excel pivot tables and used Base SAS procedures FREQ, SQL, SORT, TABULATE and MEANS to analyze the data.
  • Developed Logical Dimensional Models and processed the key facts and dimensions required for the business support.
  • Applied hot fixes and version patches for Netezza, Oracle, SQL Server and Informatica in a Windows environment.
  • Involved in database testing, writing complex SQL queries to verify transactions and business logic, such as identifying duplicate rows, using SQL Developer and PL/SQL Developer.
  • Designed the Logical Model into Dimensional Model using Star Schema and Snowflake Schema.
  • Worked with SQL, SQL*Plus, Oracle PL/SQL stored procedures, triggers and SQL queries, and loaded data into the Data Warehouse/Data Marts.
  • Rationalized the relationships expressed in the logical data model to the schema of the physical data store (that is, database tables and columns).
  • Involved in the design and development of user interfaces and customization of reports using Tableau and OBIEE, and designed cubes for data visualization and mobile/web presentation with parameterization and cascading.
  • Used SAS extensively to create ad hoc reports and match-merges, and created graphs using SAS/GRAPH. Generated output using SAS/ODS in CSV, XLS, DOC, PDF and HTML formats (see the ODS sketch after this list).
  • Generated DDL Scripts from Physical Data Model using technique of Forward Engineering in ER/Studio.
  • Developed complex SQL queries and performed execution validation for remediation and analysis.
  • Migrated the source code from Oracle into the Netezza database and performed the data modeling effort for the gaps identified during data mapping.
  • Developed SQL queries on Teradata to get data from different tables using joins and database links.
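
A minimal sketch of routing one report to the multiple ODS destinations mentioned above; the file paths and the work.region_totals dataset are hypothetical.

    ods pdf    file="/reports/claims_summary.pdf";
    ods html   file="/reports/claims_summary.html";
    ods csvall file="/reports/claims_summary.csv";

    proc report data=work.region_totals nowd;
        columns region n_claims total_amount;
        define region       / group "Region";
        define n_claims     / analysis sum "Claims";
        define total_amount / analysis sum "Total Amount" format=dollar12.2;
    run;

    ods csvall close;
    ods html   close;
    ods pdf    close;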

Environment: ER/Studio, Windows 7, Microsoft Office SharePoint 2007, Cognos, Rational Requisite Pro, MS Office (Word, Excel and PowerPoint), Teradata, MS Project, Netezza, MS FrontPage 2003, MS Access, CSV files, EDI, Documentum 2.0, UML, Java, Erwin, MS Visio, Oracle 11g, Oracle Designer, SQL Server 2008, Oracle SQL Developer, MicroStrategy 9.2, Tableau report builder, SQL, SAS, T-SQL

Confidential, Boise, ID

Data Modeler/SAS Data Analyst

Responsibilities:

  • Worked with Business Architects and System Analysts to gather business data elements from business requirements and translate them into a Logical Data Model.
  • Defined, developed and delivered consistent information and data standards, methodologies, guidelines, best practices and approved modeling techniques around data quality, data governance and data security.
  • Coded SAS programs using Base SAS and SAS/Macros for ad hoc jobs and modified data using Base SAS and macros.
  • Partnered with subject matter experts, architects and developers to capture and analyze business needs and complete all data modeling artifacts, including Conceptual, Logical and Physical Data Models.
  • Prepared new datasets from raw data files using import techniques and modified existing datasets using PROC REPORT, PROC SORT, PROC FREQ, PROC MEANS and DATA _NULL_.
  • Created complex, reusable macros and extensively used existing macros; developed SAS programs for data cleaning, validation, analysis and report generation, and tested and debugged existing macros (see the macro sketch after this list).
  • Identified objects and relationships and how they fit together as logical entities; these were then translated into a physical design using forward engineering in the E/R Studio tool.
  • Performed database tuning and optimized complex SQL queries using Teradata EXPLAIN, statistics and indexes.
  • Worked extensively on SQL querying using joins, aliases, functions, triggers and indexes.
  • Worked on Informatica Designer transformations such as Source Qualifier, Aggregator, Lookup, Expression, Normalizer, Filter, Router, Rank, Sequence Generator, Update Strategy and Joiner.
  • Developed Star and Snowflake schema-based dimensional models for the data warehouse and was involved in extracting and analyzing data from the data warehouse using SAS.
  • Used Reverse Engineering and Forward Engineering techniques on databases from DDL scripts; created tables and models in the data mart, data warehouse and staging areas, and cleansed unnecessary tables and columns and redefined various attributes and relationships in the Reverse Engineering Model.
  • Prepared new datasets from raw data files using import techniques and modified existing datasets using SET, MERGE, SORT, UPDATE, formats and functions.
  • Created Conceptual Data Model, Data Flow Diagram, Data Topology, Logical Data Model and Physical Data Model.
  • Developed scripts for loading the data into the base tables in EDW using FastLoad, MultiLoad and BTEQ utilities of Teradata.
  • Created and maintained metadata such as Data Elements dictionary for the entire Development Center Data Mart.
  • Defined, manipulated, controlled and reported/stored data using PROC SQL.
  • Conducted peer reviews of completed data models and plans to ensure quality and integrity from data capture through usage and archiving.
  • Used the Mega tool as well as E/R Studio to create the Conceptual Data Model.
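
A minimal sketch of a reusable validation macro of the kind described above; the macro, dataset and variable names are illustrative only.

    /* Hypothetical reusable macro: list out-of-range or missing values */
    %macro check_range(ds=, var=, lo=, hi=);
        title "Out-of-range or missing &var in &ds";
        proc print data=&ds;
            where &var < &lo or &var > &hi or missing(&var);
        run;
        title;
    %mend check_range;

    /* Example calls against a hypothetical lab dataset */
    %check_range(ds=work.labs, var=heart_rate, lo=30, hi=220);
    %check_range(ds=work.labs, var=sys_bp,     lo=60, hi=260);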

Environment: ER/Studio, Oracle 10g, Teradata 14, DB2, SSIS, Business Objects, SQL Server 2005/2008, Windows XP, MS Excel, Netezza, SQL, PL/SQL, T-SQL, Teradata SQL Assistant, SSRS, Informatica, SAS 9.2/9.3, SAS/ODS, SAS/SQL, Base SAS, SAS/MACRO.

Confidential, Dallas, TX

SAS Data Analyst/Data Modeler

Responsibilities:

  • Generated reports using SQL from the Oracle database for comparison with the legacy system, and created temporary tables to store the data from the legacy system.
  • Identified the Objects and relationships between the objects to develop a logical model and translated the model into physical model using Forward Engineering in ERWIN.
  • Built transformations, including aggregation and summation constructs, from the operational data source to the data warehouse.
  • Developed Data Mapping, Data Governance, and Transformation and Cleansing rules for the Master Data Management.
  • Extensively used Agile methodology as the organization standard to implement the data models.
  • Provided descriptive statistical analysis using procedures like PROC FREQ, PROC MEANS and PROC UNIVARIATE.
  • Used tools such as SAS/ACCESS and SAS/SQL to create and extract Oracle tables.
  • Developed high quality customized tables, reports and listings using PROC TABULATE, PROC SUMMARY and PROC REPORT.
  • Created SAS reports of clinical trial results using DATA _NULL_ steps and PROC REPORT for submissions to the FDA, as per user requests.
  • Developed stored procedures and complex packages extensively using PL/SQL and shell programs.
  • Worked on multiple platforms to access external files such as MS Word and MS Excel, and stored the retrieved SAS datasets.
  • Generated high-quality reports using PROC REPORT, PROC PRINT, PROC TABULATE and DATA _NULL_ steps (see the sketch after this list).
  • Developed SAS macros for data cleaning and created high-quality graphs using PROC GPLOT.
  • Normalized the database up to 3NF before fitting it into the star schema of the data warehouse.
  • Used Teradata utilities (BTEQ, MultiLoad and FastLoad) to maintain the database.
  • Developed and monitored the workflows, and was responsible for performance tuning of the staging and 3NF workflows.
  • Designed Data Marts using dimensional modeling and wrote Oracle PL/SQL ETL packages to extract data from the relational data store, transform it and load it into the data mart.
  • Implemented Referential Integrity using primary key and foreign key relationships.
  • Identified and tracked slowly changing dimensions and determined the hierarchies in dimensions.
  • Translated business concepts into XML vocabularies by designing XML Schemas with UML.
  • Created entity-relationship diagrams, functional decomposition diagrams and data flow diagrams.
  • Implemented Forward engineering to create tables, views and SQL scripts and mapping documents.
  • Denormalized the database to fit it into the star schema of the data warehouse.
  • Reviewed the existing data model and documented design aspects suspected of affecting system performance.
  • Conducted logical data model walkthroughs and validation.
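
A minimal sketch of a DATA _NULL_ listing in the style referenced above; work.ae and its variables are hypothetical stand-ins for a clinical dataset.

    data _null_;
        set work.ae end=eof;
        file print header=hdr;        /* the hdr label runs at the top of each page */
        put @1 subject_id @15 ae_term @45 severity;
        if eof then put / @1 'End of listing';
        return;
      hdr:
        put @1 'Subject' @15 'Adverse Event' @45 'Severity' /
            @1 60*'-';
        return;
    run;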

Environment: Erwin r8.2, Oracle SQL Developer, Oracle Data Modeler, Teradata, SSIS, Business Objects, SQL Server 2005/2008, Windows XP, MS Excel, SAS, SAS Macros.
