Sr. Data Modeler Resume

Charlotte, NC

SUMMARY:

  • Over nine years of experience in data analysis and data modeling, report generation, maintenance of business report processes, and data verification and validation.
  • Good knowledge of Big Data technologies - Hadoop, HBase, Cassandra, AWS Cloud, Amazon Redshift, AWS EC2, AWS S3, MongoDB.
  • Experience in using Oracle, SQL*Plus, and SQL*Loader.
  • Experience in automating and scheduling Informatica jobs using UNIX shell scripting and cron jobs for Informatica sessions.
  • Experienced in interacting with users, analyzing client business processes, documenting business requirements, performing design analysis, and developing design specifications.
  • Experienced working with data modeling tools like Erwin, Power Designer and ER Studio.
  • Strong experience in Business and Data Analysis, Data Profiling, Data Migration and Data Integration.
  • Experienced with different Relational databases like Teradata, Oracle, SQL Server and MS Access.
  • Excellent understanding of Data Warehousing Concepts - Star and Snowflake schema, SCD Type1/Type2, Normalization/De-Normalization, Dimension & Fact tables.
  • Proficient experience as a Data Analyst in gathering data from different sources, data profiling, data definition, and loading data to the business warehouse.
  • Experience in conducting Joint Application Development (JAD) sessions for requirements gathering, analysis, and design, and Rapid Application Development (RAD) sessions.
  • Expertise in a broad range of technologies, including business process tools such as Microsoft Project, MS Excel, MS Access, and MS Visio.
  • Experience in testing and writing SQL and PL/SQL statements - Stored Procedures, Functions, Triggers and packages.
  • Extensive SQL experience in querying, data extraction and data transformations.
  • Experienced in writing numerous SQL Queries and Performing ad-hoc analysis.
  • Experienced in developing business reports by writing complex SQL queries using views, macros, volatile and global temporary tables.
  • Experienced in Automating and Scheduling the Teradata SQL Scripts in UNIX using Korn Shell scripting.
  • Created pivot tables and charts using worksheet data and external resources, modified pivot tables, sorted items and group data, and refreshed and formatted pivot tables.
  • Strong experience in working with MS Access, MS Excel, and MS PowerPoint.
  • Experienced in handling all the domain and technical interaction with application users, analyzing client business processes, documenting business requirements.
  • Experience in SAS setup, coding, development, analysis, debugging, implementation, and maintenance as an analyst in Windows and UNIX environments.
  • Excellent experience in Data mining with querying and mining large datasets to discover transition patterns and examine financial data.
  • Created various dashboards using Tableau, Excel, and Access with a focus on user interface and simple data consumption.
  • Experience in Business Intelligence tools like Business Objects, Cognos, and OBIEE.
  • Experience in Data Transformation, Metadata and Data Dictionary management, Data Loading, Modeling, and Performance Tuning.
  • Experience with creating reports using Business Objects.
  • Excellent understanding of Microsoft BI toolset including Excel, Power BI, SQL Server Analysis Services, Visio, Access.
  • Strong knowledge of database management in writing SQL queries using complex joins, grouping, aggregation, and nested subqueries, and resolving key performance issues (see the sketch after this list).
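
A minimal sketch of the kind of query the last point refers to, combining joins, grouping, aggregation, and a nested subquery; all table and column names (customers, orders, order_amount, etc.) are hypothetical and for illustration only:

    -- Total spend per customer segment, restricted to customers whose
    -- lifetime spend exceeds the overall average order amount.
    SELECT c.segment,
           COUNT(DISTINCT c.customer_id) AS num_customers,
           SUM(o.order_amount)           AS total_spend
    FROM   customers c
    JOIN   orders o
           ON o.customer_id = c.customer_id
    WHERE  c.customer_id IN (SELECT customer_id
                             FROM   orders
                             GROUP  BY customer_id
                             HAVING SUM(order_amount) >
                                    (SELECT AVG(order_amount) FROM orders))
    GROUP  BY c.segment
    ORDER  BY total_spend DESC;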

TECHNICAL SKILLS:

Analysis and Modeling Tools: Erwin 9.6/9.5, Sybase Power Designer, ER/Studio V17

Big Data: Hadoop, Sqoop, Hive, Cassandra, HBase, MapReduce.

Cloud Architecture: Amazon AWS, Redshift & MS Azure

Database Tools: Microsoft SQL Server 2016/2014, Teradata 15/14, Oracle 12c/11g, MS Access, PostgreSQL, Netezza.

Languages: UNIX Shell Scripting, HTML, T-SQL, Data Structure, Algorithms.

OLAP Tools: Tableau, SAP BO, SSAS, Business Objects

ETL Tools: SSIS, SSRS, Informatica PowerCenter 9.6, SAP Business Objects XI R3.1/XI R2, Web Intelligence

Operating System: Windows, Linux, Unix

Reporting Tools: Business Objects, Crystal Reports 8/9.0

Tools & Software: TOAD, MS Office, BTEQ, Teradata SQL Assistant

Other tools: SQL*Plus, SQL*Loader, MS Project, MS Visio, UNIX, PL/SQL, etc.

Project Execution Methodologies: Agile, SDLC, Ralph Kimball and Bill Inmon, RUP, RAD, JAD

WORK EXPERIENCE:

Confidential, Charlotte, NC

Sr. Data Modeler

Responsibilities:

  • Worked with SMEs, Business Analysts and Technology teams to understand the data requirements and the full attribute set for entities and dimensions.
  • Converted business processes, domain specific knowledge, and information needs into a conceptual model.
  • Created semantically rich logical data models (non-relational/NoSQL) that define the business data requirements.
  • Converted conceptual models into logical models with detailed descriptions of entities and dimensions for Enterprise Data Warehouse.
  • Extensively used agile methodology as the Organization Standard to implement the data Models.
  • Identified data organized into logical groupings and domains, independent of any application or system.
  • Provided centralized direction for metadata repositories, data definitions, and data relationships.
  • Developed and maintained fully defined conceptual, logical, and physical dimensional data models to ensure the information models are capable of meeting end user and developer needs.
  • Developed data models and data migration strategies utilizing sound concepts of data modeling including star schema, snowflake schema.
  • Developed Model aggregation layers and specific star schemas as subject areas within a logical and physical model.
  • Documented data models including a fully normalized version.
  • Provided data harmonization, master data development, data governance, data quality and metadata management.
  • Established measures to chart progress related to the completeness and quality of metadata for enterprise information.
  • Supported reduction of data redundancy and fragmentation and elimination of unnecessary movement of data and to improve data quality.
  • Developed the data dictionary for various projects for the standard data definitions related data analytics.
  • Generated DDL scripts from physical data models for implementation on the database platform (see the sketch after this list).
  • Conducted data modeling for JAD sessions and communicated data related standards.
  • Facilitated ETL design efforts to provide direction and supported implementation of ETL designs.
  • Investigated data quality issues and provided recommendations and solutions to address them.
  • Optimized and updated UML Models (Visio) and Relational Data Models for various applications.
  • Coordinated with Data Architects to design Big Data and Hadoop projects and provided idea-driven design input.
  • Used Sqoop for data ingestion into Hadoop.
  • Worked on NoSQL databases such as Cassandra.
  • Configured Hadoop ecosystem components to read transaction data from HDFS and Hive.
  • Prepared reports to summarize the daily data quality status and work activities.
  • Performed ad-hoc analyses as needed.
  • As a part of my functional responsibility, worked with Project Managers and Business users to gather requirements.
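
The DDL generation mentioned above typically produces table scripts like the following; this is a minimal, hypothetical star schema fragment (dim_date and fact_sales are illustrative names, not from the actual models):

    -- Date dimension with a surrogate key.
    CREATE TABLE dim_date (
        date_key    INTEGER     NOT NULL PRIMARY KEY,  -- e.g. 20240131
        calendar_dt DATE        NOT NULL,
        month_name  VARCHAR(10) NOT NULL,
        year_num    SMALLINT    NOT NULL
    );

    -- Sales fact keyed to the dimension.
    CREATE TABLE fact_sales (
        date_key    INTEGER       NOT NULL REFERENCES dim_date (date_key),
        product_key INTEGER       NOT NULL,  -- FK to dim_product (not shown)
        sales_amt   DECIMAL(12,2) NOT NULL,
        units_sold  INTEGER       NOT NULL
    );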

Environment: Erwin 9.7, NoSQL, Sqoop, Cassandra 3.11, Hadoop 3.0, HDFS, Hive 2.3

Confidential, Eden Prairie, MN

Sr. Data Modeler/Data Analyst

Responsibilities:

  • Developed full life cycle software including defining requirements, prototyping, designing, coding, testing and maintaining software.
  • Updated existing models to integrate new functionality into an existing application.
  • Conducted one-on-one sessions with business users to gather data warehouse requirements.
  • Developed normalized logical and physical database models to design the OLTP system.
  • Created a dimensional model for the reporting system by identifying required dimensions and facts using Power Designer.
  • Applied a dimensional model structure to achieve an Agile data model.
  • Created DDL scripts for implementing data modeling changes. Created Power Designer reports in HTML and RTF formats depending on the requirement, published data models in the model mart, created naming convention files, and coordinated with DBAs to apply the data model changes.
  • Used forward engineering to create a Physical Data Model with DDL that best suits the requirements from the Logical Data Model.
  • Maintained and implemented Data Models for the Enterprise Data Warehouse using Power Designer.
  • Created a Data Mapping document after each assignment and wrote the transformation rules for each field as applicable.
  • Created and maintained metadata, including table and column definitions.
  • Worked with Database Administrators, Business Analysts, and Content Developers to conduct design reviews and validate the developed models.
  • Worked on NoSQL databases including HBase, MongoDB, and Cassandra. Implemented multi-data center and multi-rack Cassandra clusters.
  • Worked with MDM systems team with respect to technical aspects and generating reports.
  • Worked on AWS, provisioning EC2 infrastructure and deploying applications behind Elastic Load Balancing.
  • Wrote and executed SQL queries to verify that data has been moved from transactional system to DSS, Data warehouse, data mart reporting system in accordance with requirements.
  • Worked on PL/SQL collections: INDEX BY tables, arrays, BULK COLLECT, FORALL, etc. (see the sketch after this list).
  • Responsible for defining the naming standards for data warehouse.
  • Possessed strong Documentation skills and knowledge sharing among Team, conducted data modeling review sessions for different user groups, participated in sessions to identify requirement feasibility.
  • Worked on data warehousing, ETL, SQL, scripting, and big data (MPP + Hadoop).
  • Wrote Hadoop and Pig/Hive scripts for data processing.
  • Extensive experience in PL/SQL programming - Stored Procedures, Functions, Packages, and Triggers.
  • Reworked the existing model to create new logical and physical models that formed the basis for the new application.
  • Used Power Designer for reverse engineering to connect to existing databases and the ODS, creating graphical representations in the form of entity relationships and eliciting more information.
  • Developed Data Migration and Cleansing rules for the Integration Architecture (OLTP, ODS, DW).
  • Identified the most appropriate data sources based on an understanding of corporate data thus providing a higher level of consistency in reports being used by various levels of management.
  • Verified that the correct authoritative sources were being used and that the extract, transform and load (ETL) routines would not compromise the integrity of the source data.
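
A minimal sketch of the PL/SQL collection techniques named above (BULK COLLECT into an INDEX BY table, then a set-wise FORALL update); the orders table and status values are hypothetical:

    DECLARE
      TYPE t_order_ids IS TABLE OF orders.order_id%TYPE INDEX BY PLS_INTEGER;
      v_ids t_order_ids;
    BEGIN
      -- Fetch all candidate keys in a single round trip.
      SELECT order_id BULK COLLECT INTO v_ids
      FROM   orders
      WHERE  status = 'STAGED';

      -- Apply the update in bulk rather than row by row.
      FORALL i IN 1 .. v_ids.COUNT
        UPDATE orders
        SET    status = 'LOADED'
        WHERE  order_id = v_ids(i);

      COMMIT;
    END;
    /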

Environment: Power Designer 16.6, UNIX, Oracle 12c, Teradata V15, Agile, Informix, MS Excel 2010, Mainframes, MS Visio, Rational Rose, Requisite Pro, Tableau, AWS EC2.

Confidential, Edison, NJ

Sr. Data Modeler/Data Analyst

Responsibilities:

  • Designed and maintained conceptual and logical data models for OLTP and other analytical applications using Erwin.
  • Analyzed functional and non-functional business requirements, translated them into technical data requirements, and created or updated existing logical and physical data models.
  • Facilitated data modeling sessions ranging from information gathering to model validation.
  • Utilized a phased data modeling methodology, leveraging conceptual, logical, and physical development to arrive at model solutions.
  • Reviewed and created documents and diagrams using Data models, Data mapping, Data dictionaries and Metadata.
  • Developed and maintained metadata definitions for sourcing data stores and all evolutions of the Data Lake, from ingested sources to consumers.
  • Provided thought leadership to collecting data, staging data, cleansing data, standardizing data and publishing data into data marts.
  • Performed forward engineering of data models, reverse engineering of existing data models, and updates to the data models.
  • Developed data models for different class of ODS systems.
  • Participated in business process modeling, worked with the developers with respect to data mapping and validation of the data's life cycle.
  • Designed the dimensional data model for the Data Lake, EDW, and MDM solutions.
  • Ensured reuse of conformed dimensions and promoted the use of common data objects.
  • Worked with the ETL and MDM teams to ensure correct implementation of the data models, data quality and validation rules.
  • Worked on Data governance, data quality, data lineage establishment processes.
  • Involved with data profiling for multiple sources and answered complex business questions by providing data to business users.
  • Worked with data investigation, discovery and mapping tools to scan every single data record from many sources.
  • Collaborated with architects and data owners to define an efficient ETL and MDM strategy utilizing Informatica capabilities.
  • Worked on PL/SQL collections: INDEX BY tables, arrays, BULK COLLECT, FORALL, etc.
  • Maintained the data model repository using Model Mart and drove the team to use it as the master model repository.
  • Performed reverse engineering of physical data models from databases and SQL scripts.
  • Worked on unit testing for three reports and created SQL test scripts for each report as required.
  • Configured and developed triggers, workflows, and validation rules, and handled the deployment process from one sandbox to another.
  • Extensively used Erwin as the main tool for modeling, along with Visio.
  • Established and maintained comprehensive data model documentation including detailed descriptions of business entities, attributes, and data relationships.
  • Worked on the Metadata Repository (MRM), keeping definitions and mapping rules up to date.
  • Performed data cleaning and data manipulation activities using NZSQL utility.
  • Created and developed stored procedures and triggers to handle complex business rules, history data, and audit analysis (see the sketch after this list).
  • Directed and oversaw data quality tests, including providing input to quality assurance team members.
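
A minimal sketch of the kind of audit trigger referred to above, in Oracle PL/SQL; the customers table, status column, and audit table are hypothetical names, not from the actual project:

    -- History table for audited changes.
    CREATE TABLE customer_audit (
        customer_id NUMBER,
        old_status  VARCHAR2(20),
        new_status  VARCHAR2(20),
        changed_by  VARCHAR2(30),
        changed_at  TIMESTAMP
    );

    -- Row-level trigger that records every status change.
    CREATE OR REPLACE TRIGGER trg_customer_audit
    AFTER UPDATE OF status ON customers
    FOR EACH ROW
    BEGIN
      INSERT INTO customer_audit
        (customer_id, old_status, new_status, changed_by, changed_at)
      VALUES
        (:OLD.customer_id, :OLD.status, :NEW.status, USER, SYSTIMESTAMP);
    END;
    /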

Environment: Erwin 9.6, OLTP, PL/SQL, MS Visio, NZSQL, MDM, ETL, Informatica 9.5

Confidential, Northbrook, IL

Sr. Data Analyst

Responsibilities:

  • Led research projects on solutions to data problems, advised on methods to improve data-related business processes, and presented results to senior management.
  • Worked with various Business users to gather reporting requirements and understand the intent of reports and attended meetings to provide updates on status of the projects.
  • Directed the development of Entity-Relationship Diagrams (ERD) and Data Flow Diagrams (DFD), user experience, and configuration elements of solution design.
  • Coordinated and supported proposals, feasibility studies, and new business development.
  • Directed the development of data dictionaries for complex databases and projects.
  • Led development of complex database queries and intermediate-level DB procedures, functions, and packages to pull relevant data for analysis.
  • Created database objects like Procedures, Functions, Packages, Triggers, Indexes and Views using T-SQL in Development and Production environment for SQL Server.
  • Testing responsibilities included unit testing, integration testing, and business acceptance testing.
  • Worked on normalization techniques and normalized the data into Third Normal Form (3NF).
  • Produced complex reports, complex dimensional models, data dictionaries, and physical and logical data models.
  • Translated the requirements into business (functional) and system (technical) requirements and reviewed them with the customer to get approval.
  • Wrote complex SQL queries to analyze and understand data.
  • Generated reports from SAS, Business Objects or Crystal reports depending on the customer requirement.
  • Wrote SQL queries to create mock-up reports per the business requirements and obtained user approval.
  • Validated the data during UAT testing.
  • Worked on analysis of Master Data Management (MDM) for Data Sourcing.
  • Wrote business rules in source to target data mapping document to make it easy and clear for developers and testers.
  • Performed data analysis, data governance and data profiling using Informatica on various source systems.
  • Involved in developing ad-hoc reporting for various sales operations for different customers using Tableau dashboards.
  • Assisted team members in querying complex customer databases to prepare meaningful data sets for Tableau reporting.
  • Worked with the Business Analysts and the QA team for validation and verification of the development.
  • Created pivot tables and charts using worksheet data and external resources, modified pivot tables, sorted items and group data, and refreshed and formatted pivot tables.
  • Developed numerous PROC SQL queries, creating SET or MULTISET tables, views, and volatile tables, and using inner and outer joins.
  • Utilized advanced SQL optimization techniques using INDEX, COLLECT STATS, and PARTITION to optimize data access (see the sketch after this list).
  • Created a consolidated script with all the required tables for a report by eliminating duplicates and using several point-in-time tables joined on snap dates.
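
A minimal sketch of the Teradata patterns named above (a MULTISET volatile work table plus COLLECT STATS on its index column); daily_sales and its columns are hypothetical:

    -- Session-scoped scratch table built from a query.
    CREATE VOLATILE MULTISET TABLE vt_daily_sales AS (
        SELECT store_id,
               sales_dt,
               SUM(sales_amt) AS total_amt
        FROM   daily_sales
        GROUP  BY store_id, sales_dt
    ) WITH DATA
    PRIMARY INDEX (store_id)
    ON COMMIT PRESERVE ROWS;

    -- Give the optimizer demographics on the join/index column.
    COLLECT STATISTICS ON vt_daily_sales COLUMN (store_id);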

Environment: SQL, T-SQL, SAS, Informatica, 3NF, Crystal Reports, MS Excel, Tableau, SQL scripts, ad-hoc reporting, MS Visio.

Confidential

Data Modeler/Data Analyst

Responsibilities:

  • Involved in requirement gathering along with the business analysts group.
  • Gathered all the report prototypes from the business analysts belonging to different Business units.
  • Gathered various requirement matrices from the business analysts.
  • Participated in Joint Application Development (JAD) sessions.
  • Designed the data marts using the Ralph Kimball's Dimensional Data Mart modeling methodology using Power Designer.
  • Conducted design discussions and meetings to arrive at the appropriate data model.
  • Designed models for the various reporting requirements.
  • Designed a logical data model using database tool Power Designer.
  • Used Power Designer to create logical and physical data models for enterprise wide OLAP system.
  • Developed monthly summary and downstream data marts from enterprise wide databases in accordance with reporting requirements with dimensions like time, customers, services and accounts.
  • Used Normalization (1NF, 2NF & 3NF) and De-normalization techniques for effective performance in OLTP and OLAP systems.
  • Developed data marts for the base data in star and snowflake schemas as part of developing the data warehouse.
  • Developed star and snowflake schema based dimensional models for the data warehouse.
  • Modeled the dimensions and facts using Erwin for centralized data warehouse.
  • Identified and tracked slowly changing dimensions and determined the hierarchies in dimensions (see the sketch after this list).
  • Actively participated in data mapping activities for the data warehouse.
  • Created summary tables using de-normalization technique to improve complex join operations.
  • Performed Data Analysis and Data Profiling and worked on data transformations and data quality rules.
  • Involved in extensive data validation by writing several complex SQL queries and Involved in back-end testing and worked with data quality issues.
  • Collected, analyzed, and interpreted complex data for reporting and performance trend analysis.
  • Wrote and executed unit, system, integration, and UAT scripts in data warehouse projects.
  • Generated comprehensive analytical reports by running SQL queries against current databases to conduct data analysis pertaining to Loan products.
  • Participated in the tasks of data migration from legacy to new database system.
  • Worked on Metadata exchange among various proprietary systems using XML.
  • Conducted Design reviews with the business analysts, content developers and DBAs.
  • Designed and implemented a physical data model to handle marketing strategies and to satisfy reporting needs dynamically.
  • Organized User Acceptance Testing (UAT), conducted presentations and provided support for Business users to get familiarized with Loan products application.
  • Handled performance requirements for databases in OLTP and OLAP models.
  • Worked with the Implementation team to ensure a smooth transition from the design to the implementation phase.
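
A minimal sketch of Type 2 slowly changing dimension maintenance as tracked above: the current row is expired and a new version is inserted. The dim_customer table, its columns, and all values are hypothetical:

    -- Close out the current version of the changed customer row.
    UPDATE dim_customer
    SET    effective_end_dt = CURRENT_DATE,
           current_flag     = 'N'
    WHERE  customer_id  = 101
    AND    current_flag = 'Y';

    -- Insert the new version with an open-ended effective range.
    INSERT INTO dim_customer
        (customer_key, customer_id, address, effective_start_dt,
         effective_end_dt, current_flag)
    VALUES
        (98765, 101, '12 New Street', CURRENT_DATE, DATE '9999-12-31', 'Y');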

Environment: Power Designer, SQL Server 2008, Windows XP, Oracle 10g, UML, SQL Loader, OBIEE.
