
Sr. Data Modeler / Data Analyst Resume

Columbus, OH

SUMMARY:

  • Professional with 8.5+ years of experience and competence in Data Analytics, Reporting, Data Modeling, Data Visualization and Dimensional Modeling. Expertise includes requirement gathering, decision support and implementation of various applications in OLTP and Data Warehousing environments
  • Analytical data enthusiast with the ability to handle any form of data. Sound knowledge of data source integration, working with ETL tools, and using Python for statistical data analysis, visualization and reporting
  • Expertise in developing physical and logical data models of existing database structures and extracting reports from databases
  • Proficient at working both independently and in collaboration with multidisciplinary teams to develop structural and functional data annotations
  • Insightful exposure to the methodologies of data quality, data cleansing and data transformation, with the ability to conceptualize issues and develop well-reasoned resolutions. Understands set analysis

CORE COMPETENCIES:

  • Data Warehousing
  • Data Profiling and Cleansing
  • Integration and Extraction Tools
  • Logical and Physical Data Models
  • Data Modeling
  • Dimensional Modeling
  • Data Visualization
  • Team Leadership
  • Data Transformation
  • Data Mapping
  • Business Analysis
  • Designing Database Diagrams
  • SSIS Data Integration
  • Extract, Transform and Load

PROFESSIONAL TRAITS:

  • Extensive experience in Software Development Life Cycle (SDLC) including System Analysis, Reporting, Data Modeling, Data Analysis and Business Analysis
  • Good experience in evaluating business systems for user needs, business modeling and document processing.
  • Strong background in designing various Logical and Physical Data Models using Erwin, MS Visio, ER Studio, Power Designer and Toad Data Modeler.
  • Solid experience in Relational Modeling, Dimensional Modeling, Conceptual Modeling, Logical Modeling, Physical Modeling, Data Warehousing, Fact Tables, Dimension Tables, Star Schema and Snowflake Schema as per enterprise standards.
  • Experience in developing data models which will serve both OLTP and OLAP functionality as per business needs.
  • Skilled in developing PL/SQL Scripts, stored procedures, Triggers, Views, and Indexes.
  • Hands-on experience with SQL Server services such as Integration Services (SSIS), Analysis Services (SSAS) and Reporting Services (SSRS), and with Extract, Transform and Load (ETL) using Talend and Informatica PowerCenter
  • Expertise in Python-based environments along with analytics, data wrangling and Excel data extracts.
  • Strong experience using Python libraries such as Beautiful Soup, NumPy, SciPy, Matplotlib, pandas DataFrame, urllib2 and MySQLdb for database connectivity (see the wrangling sketch after this list)
  • A data enthusiast with expertise in analyzing large datasets, creating Tableau dashboards and providing actionable insights to executives; experienced in developing complex dashboards with chart visualizations in Tableau such as geo maps, symbol maps, pie charts, bubble charts, bar charts (horizontal and stacked), tree graphs and scatter plots
  • Creative thinker with the ability and experience to drill down from highest to lowest level of granularity, and propose unique methods to discern problems by using data mining approaches on the set of information available
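Illustrative sketch for the Python data-wrangling and reporting skills above. This is a generic, assumed example only: the file name, column names and aggregation are hypothetical placeholders, not work products from any engagement listed below.

    import pandas as pd
    import matplotlib.pyplot as plt

    # Hypothetical input file and columns, used purely for illustration.
    claims = pd.read_csv("claims_extract.csv", parse_dates=["claim_date"])

    # Basic cleansing: drop exact duplicates and rows missing a claim amount.
    claims = claims.drop_duplicates().dropna(subset=["claim_amount"])

    # Simple transformation: monthly claim totals per line of business.
    monthly = (
        claims
        .assign(month=claims["claim_date"].dt.to_period("M").dt.to_timestamp())
        .groupby(["month", "line_of_business"], as_index=False)["claim_amount"]
        .sum()
    )

    # Lightweight visualization of the aggregated result.
    for lob, grp in monthly.groupby("line_of_business"):
        plt.plot(grp["month"], grp["claim_amount"], label=lob)
    plt.legend()
    plt.title("Monthly claim totals by line of business (illustrative)")
    plt.tight_layout()
    plt.savefig("monthly_claims.png")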

TECHNICAL SKILLS:

Data Modeling Tools: Erwin r7/r7.1/7.2/r8.2/r9.1/9.5/9.6, Embarcadero ER/Studio, Enterprise Architect, Oracle Designer, and Sybase Power Designer

ETL Tools: Talend, Informatica 6.2/7.1, Data Junction, Ab Initio, DataStage, SSIS, PowerCenter 8.6/7.1.1, Informatica Designer, Workflow Manager, Workflow Monitor

Database Tools: Microsoft SQL Server, MySQL, Oracle, DB2, MS Access 2000 and Teradata V2R6.1

OLAP Tools: Microsoft Analysis Services, Business Objects and Crystal Reports

Other Tools: SAS Enterprise Guide, SAP ECC and Panorama Web Service

Packages: Microsoft Office Suite, Microsoft Visual Studio, Microsoft Project 2010 and Microsoft Visio

Programming Languages: Python, R, SQL, T-SQL, PL/SQL, Base SAS, HTML, XML, UNIX and Shell Scripting

Reporting Tools: Tableau, Power BI, QlikView v9+, SQL/Oracle, DW/BI Concepts

CAREER PROGRESSION:

Confidential, Columbus, OH

Sr. Data Modeler / Data Analyst

Responsibilities:

  • Identified business requirements by working with business users and created and maintained logical and physical data models using IBM InfoSphere Data Architect.
  • Designed and developed Data Marts following a hybrid data modeling approach combining Snowflake Schema and Star Schema; applied data warehouse concepts and dimensional data modeling using the Ralph Kimball methodology
  • Actively performed data analysis, data profiling and validity checks, and defined data cleansing rules using Informatica Data Quality (IDQ).
  • Performed data analysis across subject areas such as policies, insurance holders, agreements, claims and transactions; provided solutions to resolve gaps in the data flow and led discussions with the associated stakeholders and SMEs to resolve them.
  • Worked extensively on forward and reverse engineering processes. Created DDL scripts for implementing Data Modeling changes.
  • Built efficient SSIS packages for processing fact and dimension tables with complex transforms and type 1 and type 2 slowly changing dimensions
  • Performed data mining and gap analysis on data sources and destination systems to ensure data accuracy.
  • Created database objects such as views, tables, stored procedures and indexes
  • Defined the list codes and code conversions between the source systems and the Data Mart, and was actively involved in extensive data analysis on Teradata and Oracle systems, querying and writing in SQL.
  • Developed mapping spreadsheets; provided the Data Warehouse Development (ETL) team with source to target Data Mapping.
  • Created source and target table definitions using Informatica. Source data was extracted from Flat files, SQL Server and Netezza Database
  • Created SQL jobs to schedule Informatica PowerCenter packages. Used the data profiling task in Informatica IDQ to identify poor data and repair it
  • Applied conditional formatting in Tableau to highlight key areas in the report data and Used Report Manager to assign roles, permissions and to create report schedules.
  • Used Sub Reports, Graphing, Data Drill-Down, and Data sorting/grouping from OLTP and OLAP data sources and created various dynamic reports using Tableau
  • Wrote Python scripts to connect, extract and stage data from web-based API calls as part of the data loading process
  • Used the Python standard library and pandas for data wrangling: cleaning, transforming, combining and merging data sets, and restructuring data
  • Used Python scripts to update content in the database and manipulate files
  • Wrote and executed various RDBMS/MySQL database queries from Python using the Python MySQL connector and MySQLdb packages (see the sketch after this list)
  • Created visualizations using custom hierarchies and groups, and merged data from multiple sources using the data blending technique within the Tableau workbook.
  • Developed visually compelling reports and interactive dashboards and performed forecasting.
  • Involved in creating Test Data for evaluating in and out boundaries of Testing. Involved in fixing the defects raised by QA Team.
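A minimal sketch of the connect-extract-stage pattern described above (web API to MySQL staging). The endpoint, table, columns and credentials are hypothetical placeholders, and the requests/pandas/mysql-connector combination is an assumed illustration of the approach rather than the project's actual code.

    import requests
    import pandas as pd
    import mysql.connector

    # Hypothetical API endpoint and staging table; placeholders only.
    API_URL = "https://example.com/api/policies"

    # Connect-extract: pull JSON records from the web-based API.
    records = requests.get(API_URL, timeout=30).json()
    df = pd.DataFrame(records)

    # Light wrangling before staging: normalize column names, drop empty rows.
    df.columns = [c.strip().lower() for c in df.columns]
    df = df.dropna(how="all")

    # Stage into MySQL via the MySQL connector (credentials are placeholders).
    conn = mysql.connector.connect(
        host="localhost", user="etl_user", password="***", database="staging"
    )
    cur = conn.cursor()
    cur.executemany(
        "INSERT INTO stg_policies (policy_id, holder_name, premium) "
        "VALUES (%s, %s, %s)",
        list(df[["policy_id", "holder_name", "premium"]]
             .itertuples(index=False, name=None)),
    )
    conn.commit()
    cur.close()
    conn.close()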

Confidential, Lowell, AR

Sr. Data Modeler / Data Analyst

Responsibilities:

  • Created logical and physical models according to the requirements and physicalized the logical models in the target database.
  • Effectively articulated reasoning for data model design decisions and strategically incorporated team member feedback to produce the highest quality data models.
  • Conducted data analysis, mapping transformation and data modeling, and applied data warehouse concepts
  • Understood business/functional requirements and issues and translated them into technical/data mapping rules and issues
  • Conducted detailed data mapping including complex data transformation rules, record-building or join conditions, and reference data translations
  • Conducted detailed source and target data analysis using Informatica Data or database SQL/PL-SQL
  • Built relational and dimensional data models using the Erwin data modeler design tool, with a very good understanding of DW concepts and of conceptual, logical and physical data modeling
  • Performed Reverse engineering of the source systems using Erwin data modeler.
  • Involved in capturing Data Lineage, Table and Column Data Definitions, Valid Values and other necessary information in the data models.
  • Identified fact and dimension tables and established the grain of the fact for dimensional models. Generated the DDL of the target data model and attached it to the Jira ticket to be deployed in different environments.
  • Worked closely with end users in identifying and implementing right set of business criteria to convert raw data into meaningful information.
  • Defined best practices for Tableau report development. Reviewed basic SQL queries and edited inner, left, and right joins in Tableau Desktop by connecting live/dynamic and static datasets.
  • Analyzed the JSON and XML files using XMLSpy and was involved in integrating the JSON files with the API (see the parsing sketch after this list).
  • Worked collaboratively with ETL team and QA team to ensure system deliverables align with business requirements with measurable results.
  • Deploy the DDL in the Development environment and required database.
  • Worked closely with the Data Architect to review all design models with respect to functions, definitions and maintenance, and to review and support the data analysis, data quality and ETL design that feeds the logical data models of the new EDW.
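A brief sketch of the kind of JSON file check referenced above before records are handed to the API. The file name and required fields are hypothetical placeholders; the actual analysis was done with XMLSpy and project-specific rules.

    import json

    # Hypothetical source file and required fields; placeholders for illustration.
    REQUIRED_FIELDS = {"shipment_id", "origin", "destination", "ship_date"}

    with open("shipments.json", encoding="utf-8") as fh:
        payload = json.load(fh)

    # Flag records missing any of the fields the API integration expects.
    bad_records = [rec for rec in payload if not REQUIRED_FIELDS.issubset(rec)]

    print(f"{len(payload)} records read, {len(bad_records)} failed the field check")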

Confidential, Atlanta, GA

Sr. Data Modeler / Data Analyst

Responsibilities:

  • Captured requirements from business and technical staff to analyze data requirements and performed appropriate data analysis on source data
  • Documented database solutions and whiteboarded solutions for project teams, the Data Engineering team and architecture review boards.
  • Integrated data into existing enterprise logical model and physical data stores to avoid data redundancy.
  • Created source-to-target mapping documents showing data movement, in partnership with the source system and consuming system developers
  • Implemented data solutions to satisfy business requirements and provided guidance to developers on the approach to designing the source-to-target mapping.
  • Analyzed the source file layouts and formats, captured all necessary information and documented it in the source-to-target mapping document
  • Conducted detailed data analysis and produced quality data work products
  • Understood business/functional requirements and issues and translated them into technical/data mapping rules and issues
  • Conducted detailed data mapping including complex data transformation rules, record-building or join conditions, and reference data translations
  • Designed relational data models with an emphasis on developing Star Schemas to support robust data analytics (Kimball dimensional modeling)
  • Designed and built operational reports/dashboards using tools such as Tableau and Microsoft Power BI to support more efficient reporting, analysis, planning and forecasting
  • Leveraged a variety of input sources (e.g., databases, flat files, custom applications) to populate business intelligence solutions that drive better decision-making and prioritization within the organization (see the star schema sketch after this list)
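The sketch below illustrates the star-schema idea behind the two bullets above: resolving dimension surrogate keys for a fact load from flat-file input. All file, table and column names are hypothetical placeholders, and pandas stands in for whichever ETL tool actually performed the joins.

    import pandas as pd

    # Hypothetical flat-file extracts; table and column names are placeholders.
    sales = pd.read_csv("sales_extract.csv")        # transactional source
    dim_customer = pd.read_csv("dim_customer.csv")  # customer_nk -> customer_sk
    dim_date = pd.read_csv("dim_date.csv")          # calendar_date -> date_sk

    # Resolve surrogate keys by joining the source rows to each dimension.
    fact = (
        sales
        .merge(dim_customer[["customer_nk", "customer_sk"]],
               on="customer_nk", how="left")
        .merge(dim_date[["calendar_date", "date_sk"]],
               left_on="sale_date", right_on="calendar_date", how="left")
    )

    # Keep only the surrogate keys and measures for the fact table load.
    fact_sales = fact[["customer_sk", "date_sk", "quantity", "amount"]]
    fact_sales.to_csv("fact_sales_load.csv", index=False)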

Confidential, San Antonio, TX

Data Modeler/ Data Analyst

Responsibilities:

  • Creatively developed Use Cases using MS Visio, a detailed project plan, Conceptual Data Models and Logical Data Models, and transformed them to create the schema using Erwin; created reports by dragging/extracting data from the cube and wrote MDX scripts
  • Developed and implemented Test Strategies using Test Director, and developed the Data Mart for base data in Star Schema and Snowflake Schema while developing the data warehouse for the database
  • Designed Logical Data Models and Physical Data Models using Erwin; performed Forward Engineering of the Data Models, Reverse Engineering on the existing Data Models and updated the data models
  • Implemented Forward Engineering by using DDL scripts and generated indexing strategies to develop the logical data model using Erwin
  • Created Talend Mappings to populate the data into dimensions and fact tables and Created ETL/Talend jobs both design and code to process data to target databases.
  • Read .CSV and fixed-width files and created multi-schema output files using Talend Open Studio.
  • Created .CSV files from Excel spreadsheets and loaded them into the target Oracle Database using Talend Open Studio (see the sketch after this list).
  • Created ad-hoc reports for the upper level management using stored procedures and MS SQL Server Reporting Services (SSRS) in accordance with the business requirements
  • Created various kinds of reports using Power BI and Tableau based on the client's needs.
  • Created multiple kinds of reports in Power BI, presented them using Story Points, and created dashboards, scorecards, views, pivot tables and charts for further data analysis.
  • Migrated existing data into Power BI. Worked with the application developers and provided the necessary SQL scripts using T-SQL.
  • Created joins and sub-queries for complex queries involving multiple tables.
  • Responsible for creating SQL datasets for Power BI and Ad-hoc Reports. Used SSIS to create ETL packages to validate, extract, transform and load data to data warehouse databases, data mart databases, and process SSAS cubes to store data to OLAP databases.
  • Created dynamic reports that would best suit the needs of the technical architectural development process and proactively improved the performance of reports on an ad-hoc basis
  • Documented, analyzed and defined the source-to-target data mappings and business rules to eliminate redundant, inefficient processes and practices
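An illustrative Python equivalent of the Talend Open Studio file handling described above (fixed-width and Excel input, .CSV output, Oracle load). The actual jobs were built in Talend; the paths, column widths, table and credentials below are assumed placeholders.

    import pandas as pd
    import cx_Oracle

    # Placeholders only: the real jobs were Talend Open Studio components.
    fixed = pd.read_fwf("legacy_extract.txt", widths=[10, 25, 12],
                        names=["item_id", "item_desc", "unit_cost"])
    sheet = pd.read_excel("pricing.xlsx")

    # Create a .CSV file from the Excel spreadsheet.
    sheet.to_csv("pricing.csv", index=False)

    # Load the fixed-width rows into the target Oracle table.
    conn = cx_Oracle.connect("etl_user", "***", "localhost/ORCLPDB1")
    cur = conn.cursor()
    cur.executemany(
        "INSERT INTO stg_items (item_id, item_desc, unit_cost) "
        "VALUES (:1, :2, :3)",
        list(fixed.itertuples(index=False, name=None)),
    )
    conn.commit()
    conn.close()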

Confidential

Data Engineer/Data Management Analyst

Responsibilities:

  • Defined corporate Metadata definitions for all Enterprise Data Supported databases and performed Data Analysis, Data Profiling and Validity, and Data Cleansing
  • Developed Mapping Spreadsheets as well as Logical and Physical Data Models; defined and developed designs to support departmental and functional Data Marts
  • Worked closely with Analytical and Reporting infrastructure teams to ensure operational and analytical data warehouses could support all business requirements for OLAP reporting across the organization
  • Worked across many Transformation & Regulatory Programmes
  • Was instrumental in driving the creation of key assets from the business data architecture through to the logical data architecture, providing traceability through to the Physical Domains and Business Functions
  • Articulated content and modeling principles to other modelers and business users
  • Developed models to reflect business area-specific data needs
  • Supported the review, sign-off and adoption of models in collaboration with global businesses
  • Created SSIS packages to populate data from various data sources.
  • Created ad-hoc reports using the Power Pivot add-in
  • Completed proofs of concept using Power BI tools (Power Query/Power View) for Excel. Created SSIS packages to extract data from the OLTP to the OLAP system and created alerts for successful and unsuccessful completion of scheduled jobs (see the sketch after this list).
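A minimal sketch of the job-completion alerting mentioned above. The real alerts were implemented with SSIS; this Python version, with a hypothetical job-status table, DSN and mail addresses, only illustrates the check-and-notify pattern.

    import smtplib
    from email.message import EmailMessage
    import pyodbc

    # Hypothetical metadata DSN and job_runs table; placeholders only.
    conn = pyodbc.connect("DSN=etl_metadata")
    failed = conn.cursor().execute(
        "SELECT job_name, finished_at FROM job_runs "
        "WHERE run_date = CAST(GETDATE() AS DATE) AND status = 'FAILED'"
    ).fetchall()

    # Notify the team only when a scheduled job did not complete successfully.
    if failed:
        msg = EmailMessage()
        msg["Subject"] = f"{len(failed)} scheduled job(s) failed"
        msg["From"] = "etl-alerts@example.com"
        msg["To"] = "dw-team@example.com"
        msg.set_content("\n".join(f"{name} at {ts}" for name, ts in failed))
        with smtplib.SMTP("mailhost.example.com") as smtp:
            smtp.send_message(msg)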

Confidential

Jr. Business Analyst

Responsibilities:

  • Documented Business Plan and Test Plan based on gathered requirements.
  • Designed and executed test plans, tracked defects and got them resolved to ensure that business requirements and functional specifications were tested and fulfilled. Developed Conceptual, Logical and Physical data models and scripts to create users, roles and schema objects
  • Transformed project data requirements into project data models.
  • Performed data profiling and data quality assessment on the SOR files (see the profiling sketch after this list).
  • Responsible as a member of development team to provide business data requirements analysis services, producing logical and Physical data models.
  • Demonstrated a clear understanding of each logical data model and articulated the intent and vision of each model through consumer and supplier model walkthroughs
  • Made business recommendations based on data collected to improve business efficiency.
  • Implemented functionalities of application according to business requirements through stored procedures, functions and triggers.
  • Closely worked with ETL process development team.
  • Worked within an agile/iterative environment. Collaborated with a large set of stakeholders across the enterprise to understand consumer requirements in detail, research industry data modeling standards, define strategies for standardization of Critical Data Elements and propose appropriate models
  • Perform Data Analysis on both source data and target data after transfer to Data Warehouse.
  • Utilized SQL to develop stored procedures, views to create result sets to meet varying reporting requirements.
  • Used Visual Studio Report Builder to design reports of varying complexity and maintained system design documents.
  • Performed data analysis and reporting using SSRS.
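A small sketch of the SOR-file profiling mentioned above: per-column null percentage, distinct count and a sample value. The file and column names are hypothetical placeholders.

    import pandas as pd

    # Hypothetical system-of-record extract; file and columns are placeholders.
    sor = pd.read_csv("sor_accounts.csv", dtype=str)

    # Basic profile: null percentage, distinct count and one sample value per column.
    profile = pd.DataFrame({
        "null_pct": (sor.isna().mean() * 100).round(1),
        "distinct": sor.nunique(),
        "sample": sor.apply(
            lambda col: col.dropna().iloc[0] if col.notna().any() else None
        ),
    })
    print(f"{len(sor)} rows profiled")
    print(profile)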

Confidential

Oracle Developer/Associate Business Analyst

Responsibilities:

  • Created triggers, stored procedures, views and SQL-Scripts.
  • Incorporated changes to the existing Packages, triggers, procedures and functions to cater to the new functionality.
  • Created PL/SQL procedures to aid business functionalities such as bidding and allocation of inventory to the shippers (see the sketch after this list).
  • Worked closely with the team of business analysts and quality analysts to understand the business requirements and developed the Business Object Model (BOM).
  • Part of a team responsible for impact analysis and the analysis and design of the system.
  • As a developer, created Program Specifications and Unit Test cases for the various modules to be developed.
  • Created batch program for scheduling the processes of data loading between various tables.
  • Developed User entry screens to enter data pertaining to inventory and trading modules.
  • Developed reports for the various modules using Reports 2.5.
  • Developed the logical and physical models from the conceptual model built in Erwin, after understanding and analyzing the business requirements.
  • Validated the modifications to be performed on the existing database to ensure the efficiency and its performance.
  • Recommended required corrections by examining the new application proposal.
  • Assigned different roles, granted/revoked permissions to the database users.
  • Maintained documentation using Crystal Reports.
  • Provided Unit testing, SIT and UAT testing support and deployment support
  • Worked on PL/SQL and UNIX shell scripts to process ETL tasks such as data loading, cleansing and file management
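A short sketch of how a PL/SQL allocation procedure like the one described above might be invoked from Python via cx_Oracle. The procedure name, parameters and credentials are hypothetical placeholders; the original logic was written and run directly in PL/SQL.

    import cx_Oracle

    # Placeholders only: procedure name, arguments and credentials are hypothetical.
    conn = cx_Oracle.connect("app_user", "***", "localhost/ORCLPDB1")
    cur = conn.cursor()

    # OUT bind variable to capture the number of units the procedure allocates.
    allocated = cur.var(int)
    cur.callproc("allocate_inventory", [1001, "SHIPPER_42", allocated])

    print(f"Units allocated: {allocated.getvalue()}")
    conn.commit()
    conn.close()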
