
Sr. Data Modeler/data Analyst Resume


Eden Prairie, MN

PROFESSIONAL SUMMARY:

  • Over 10 years of experience in data architecture and data modeling, data development, and the implementation and maintenance of databases and software applications.
  • Experience in designing Star schema, Snowflake schema for Data Warehouse, ODS architecture.
  • Hands-on experience writing and optimizing SQL queries in Oracle, SQL Server, DB2, Netezza, and Teradata.
  • Experience in requirements gathering, systems analysis, resolving business and technical issues, and communicating with both business and technical users.
  • Good experience in working with different ETL tool environments like SSIS, Informatica and reporting tool environments like SQL Server Reporting Services (SSRS), Cognos and Business Objects.
  • Knowledge and working experience on big data tools like Hadoop, Azure Data Lake, AWS Redshift.
  • Hands-on experience in normalization (1NF, 2NF, 3NF, and BCNF) and denormalization techniques for effective and optimum performance in OLTP and OLAP environments.
  • Experience with Requirements gathering, interacting with users, analyzing client business processes, documenting business requirements and conducting Joint Application Design (JAD) sessions to discuss overall design patterns.
  • Good experience in data architecture using SQL queries and data profiling tools.
  • Experience in designing the data mart and creation of cubes.
  • Experience in developing conceptual and logical models and physical database designs for Online Transaction Processing (OLTP) and Online Analytical Processing (OLAP) systems (ODS, ADS, and data marts).
  • Good experience in using SSRS and Cognos in creating and managing reports for an organization.
  • Solid knowledge of data marts, Operational Data Store (ODS), OLAP, and dimensional data modeling with the Ralph Kimball methodology (star schema and snowflake modeling for fact and dimension tables) using Analysis Services.
  • Expertise in data architecture, data modeling, data migration, data profiling, data cleansing, transformation, integration, data import, and data export using multiple ETL tools such as Informatica PowerCenter.
  • Experience in designing, building, and implementing the complete Hadoop ecosystem, comprising MapReduce, HDFS, Hive, Impala, Pig, Sqoop, Oozie, HBase, MongoDB, and Spark.
  • Experience with Client-Server application development using Oracle PL/SQL, SQL PLUS, SQL Developer, TOAD, and SQL LOADER.
  • Strong experience architecting highly performant databases using PostgreSQL, PostGIS, MySQL, and Cassandra.
  • Extensive experience using ER modeling tools such as Erwin and ER/Studio, as well as Teradata, BTEQ, MLDM, and MDM.
  • Good knowledge of problem solving and analytical skills with exceptional ability to learn and master new technologies efficiently.
  • Extensive experience in shell scripting and scripting languages such as Python, Perl, and Ruby.
  • Good knowledge of SQL queries and of creating database objects such as stored procedures, triggers, packages, and functions using SQL and PL/SQL to implement business logic.
  • Experience in Technical consulting and end-to-end delivery with architecture, data modeling, data governance and design - development - implementation of solutions.
  • Worked on background processes in Oracle architecture, drilling down to the lowest levels of systems design and construction.
  • Used Erwin to create conceptual, logical, and physical data models.
  • Experience in BI/DW solutions (ETL, OLAP, data marts) with Informatica and BI reporting tools such as Tableau and QlikView; also experienced leading teams of application, ETL, and BI developers as well as testing teams.
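The star-schema modeling summarized above can be illustrated with a minimal sketch. Everything below is hypothetical (invented table, column, and data values), using SQLite only to keep the example self-contained:

```python
import sqlite3

# Minimal star-schema sketch: one fact table joined to two dimensions.
# All table and column names here are illustrative, not from a real project.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

cur.executescript("""
CREATE TABLE dim_customer (
    customer_key INTEGER PRIMARY KEY,
    customer_name TEXT,
    region TEXT
);
CREATE TABLE dim_date (
    date_key INTEGER PRIMARY KEY,
    calendar_date TEXT,
    fiscal_quarter TEXT
);
CREATE TABLE fact_sales (
    customer_key INTEGER REFERENCES dim_customer (customer_key),
    date_key INTEGER REFERENCES dim_date (date_key),
    sales_amount REAL
);

INSERT INTO dim_customer VALUES (1, 'Acme', 'Midwest');
INSERT INTO dim_date VALUES (20240101, '2024-01-01', 'Q1');
INSERT INTO fact_sales VALUES (1, 20240101, 250.0);
""")

# A typical dimensional query: aggregate the fact by dimension attributes.
cur.execute("""
    SELECT c.region, d.fiscal_quarter, SUM(f.sales_amount)
    FROM fact_sales f
    JOIN dim_customer c ON f.customer_key = c.customer_key
    JOIN dim_date d ON f.date_key = d.date_key
    GROUP BY c.region, d.fiscal_quarter
""")
print(cur.fetchall())  # [('Midwest', 'Q1', 250.0)]
```

The design choice illustrated here is the Kimball pattern the summary names: measures live in the narrow fact table, descriptive attributes in the dimensions, and reports are joins that aggregate by dimension attributes.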

TECHNICAL SKILLS:

Big Data technologies: HBase, HDFS, Sqoop, Spark, Hadoop, Hive

Cloud Management: Amazon Web Services (AWS), Redshift

Data Modeling Tools: ER/Studio V17, Erwin 9.6/9.5, Sybase PowerDesigner.

OLAP Tools: Tableau, SAP BO, SSAS, Business Objects, and Crystal Reports 9

Cloud Platform: AWS, Azure, Google Cloud, Cloud Stack/Open Stack

Programming Languages: SQL, PL/SQL, UNIX shell Scripting, PERL, AWK, SED

Databases: Oracle 12c/11g, Teradata R15/R14, MS SQL Server 2016/2014, DB2.

Testing and defect tracking Tools: HP/Mercury Quality Center, WinRunner, MS Visio & Visual SourceSafe

Operating System: Windows, Unix, Sun Solaris

ETL/Data warehouse Tools: Informatica 9.6/9.1, SAP Business Objects XIR3.1/XIR2, Talend, Tableau, Pentaho.

Methodologies: RAD, JAD, RUP, UML, System Development Life Cycle (SDLC), Agile, Waterfall Model.

WORK EXPERIENCE:

Confidential - Eden Prairie, MN

Sr. Data Modeler/Data Analyst

Responsibilities:

  • Developed full life cycle software, including defining requirements, prototyping, designing, coding, testing, and maintaining software.
  • Interacted with business analysts, SMEs, and other data architects to understand business needs and functionality for various project solutions.
  • Owned and managed all changes to the data models. Created data models, solution designs and data architecture documentation for complex information systems.
  • Installed and configured open-source software such as Pig, Hive, HBase, and Sqoop.
  • Designed physical and logical ER diagrams using Erwin and mapped the data into database objects.
  • Produced Logical /Physical Data Models.
  • Part of team conducting logical data analysis and data modeling JAD sessions, communicated data-related standards.
  • Applied a dimensional model structure to achieve an Agile data model.
  • Worked on Amazon Redshift and AWS and architecting a solution to load data, create data models.
  • Interacted with users and business analysts to gather requirements.
  • Worked on designing a star schema for the detailed data marts and planned data marts involving conformed dimensions.
  • Loaded and transformed large sets of structured, semi structured and unstructured data using Hadoop/Big Data concepts.
  • Responsible for Big data initiatives and engagement including analysis, brain storming, POC, and architecture.
  • Designed and developed architecture for data services ecosystem spanning Relational, NoSQL, and Big Data technologies.
  • Developed Data Mapping, Data Governance, Transformation and Cleansing rules for the Master Data Management Architecture involving OLTP, ODS and OLAP.
  • Designed the data warehouse architecture for all the source systems using MS Visio.
  • Worked with team to document the transformation rules for data migration from OLTP to Warehouse environment for reporting purposes.
  • Responsible for different Data mapping activities from Source systems to Teradata.
  • Worked with data architects to design a new data mart supporting Google Analytics data reports.
  • Developed and maintained an Enterprise Data Model (EDM) to serve as both the strategic and tactical planning vehicles to manage the enterprise data warehouse. This effort involves working closely with the business.
  • Developed and implemented data cleansing, data security, data profiling and data monitoring processes.
  • Worked with MDM systems team with respect to technical aspects and generating reports.
  • Created and maintained data model/architecture standards, including master data management (MDM).
  • Involved in Data Architecture, Data profiling, Data analysis, data mapping and Data architecture artifacts design.
  • Involved in end to end implementation of Big data design.
  • Developed database architecture using design standards and tools including Erwin and IBM InfoSphere DB2.
  • Wrote and executed SQL queries to verify that data has been moved from transactional system to DSS, Data warehouse, data mart reporting system in accordance with requirements.
  • Worked on PL/SQL collections, index-by tables, arrays, BULK COLLECT, FORALL, etc.
  • Created and communicated the strategy and use of Big Data.
  • Responsible for the data architecture design delivery, data model development, review, approval and Data warehouse implementation.
  • Worked in importing and cleansing of data from various sources like Teradata, Oracle, flat files, SQL Server with high volume data.
  • Testing responsibilities included unit testing, integration testing, and business acceptance testing.
  • Applied appropriate level of abstraction in designs and confirmed that Data designs support the integration of data and information flow across systems and platforms.
  • Involved in data model reviews with internal data architect, business analysts, and business users with explanation of the data model to make sure it is in-line with business requirements.
  • Trained junior data modelers on data architecture standards and data modeling standards.
  • Created views and dashboards on end client's data. Produced powerful dashboards telling story behind the data in an easy to understand format such as pie, bar, geo, and line charts that are viewed daily by senior Management.

Environment: Agile, SQL, AWS, Redshift, MS Visio, MDM, NoSQL, HBase, MongoDB, Cassandra cluster, Erwin 9.7
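The data profiling and load-verification work in this role can be sketched as a simple null-count/distinct-count check. The records and field names below are invented for illustration:

```python
# Sketch of basic data-profiling checks (null rate, distinct count), of the
# kind run when validating a source-to-target load. Sample records and
# field names are hypothetical.
records = [
    {"cust_id": 1, "state": "MN", "balance": 100.0},
    {"cust_id": 2, "state": None, "balance": 55.5},
    {"cust_id": 3, "state": "MN", "balance": None},
]

def profile(rows, field):
    """Return (null_count, distinct_non_null_count) for one field."""
    values = [r[field] for r in rows]
    nulls = sum(v is None for v in values)
    distinct = len({v for v in values if v is not None})
    return nulls, distinct

for field in ("cust_id", "state", "balance"):
    print(field, profile(records, field))
# cust_id (0, 3) / state (1, 1) / balance (1, 2)
```

In practice the same counts would be computed in SQL against source and target and compared to catch dropped or corrupted rows.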

Confidential - Hillsboro, OR

Sr. Data Modeler/Data Analyst

Responsibilities:

  • Developed full life cycle software including defining requirements, prototyping, designing, coding, testing and maintaining software.
  • Extensively used Agile methodology as the Organization Standard to implement the data Models.
  • Developed OLTP system by designing Logical and eventually Physical Data Model from the Conceptual Data Model.
  • Maintained Referential Integrity by Introducing foreign keys and normalized the existing data structure to work with the ETL team and provided source to target mapping to implement incremental, full and initial loads into the target data mart.
  • Worked on normalization techniques. Normalized the data into 3rd Normal Form (3NF).
  • Arranged various guidance sessions for programmers, engineers, system analysts, and others to clarify performance requirements, interfaces, and project capabilities and limitations.
  • Created the best fit Physical Data Model based on discussions with DBAs and ETL developers.
  • Created conceptual, logical and physical data models, data dictionaries, DDL and DML to deploy and load database table structures in support of system requirements.
  • Analyzed and presented the gathered information in graphical format for the ease of business managers.
  • Documented activities and communicated the status of physical data modeling work to data architects.
  • Produced Source to target data mapping by developing the mapping spreadsheets.
  • Identified required dimensions and Facts using Erwin tool for the Dimensional Model.
  • Validated business data objects to ensure the accuracy and completeness of the database.
  • Represented existing business models by UML diagrams.
  • Recognized and resolved discrepancies between executed DDL and the physical data model using Complete Compare.
  • Used Erwin tool to develop a Conceptual Model based on business requirements analysis.
  • Implemented a snowflake schema to reduce redundancy in the database.
  • Implemented forward engineering in Erwin, generating DDL scripts and indexing strategies from the logical data model.
  • Reverse Engineered physical data models from SQL Scripts and databases.
  • Optimized and updated UML Models (Visio) and Relational Data Models for various applications.
  • Supported SalesForce.com maintenance with services such as periodic data cleansing and workflow.
  • Implemented ETL techniques for Data Conversion, Data Extraction and Data Mapping for different processes as well as applications.
  • Maintained Data Consistency by evaluating and updating logical and physical data models to support new and existing projects.
  • Developed and maintained the data dictionaries, Naming Conventions, Standards, and Class words Standards Document.
  • Involved in Data profiling in order to detect and correct inaccurate data and maintain the data quality.
  • Developed Data Migration and Cleansing rules for the Integration Architecture (OLTP, ODS, DW).
  • Reviewed code and designs to ensure the systems were error-free and recommended updates where needed.
  • Identified and tracked the slowly changing dimensions (SCD I, II, III & Hybrid/6) and determined the hierarchies in dimensions.
  • Generated various reports using SQL Server Report Services (SSRS) for business analysts and the management team.
  • Designed OLTP system environment and maintained documentation of Metadata.
  • Used Informatica and SAS to extract, transform, and load source data from transaction systems.
  • Well experienced in documenting data relationships, business rules, allowed values, glossary terms, and codes.

Environment: OLTP, DBA, ETL, DDL, DML, Erwin 9.6, UML diagrams, Snowflake schema, SQL, Data Mapping, Metadata, SAS, Informatica 9.5
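The slowly changing dimension tracking mentioned above (SCD Type II in particular) follows a standard pattern: expire the current row and insert a new version. This sketch is hypothetical; the row layout (effective/end dates, current flag) is a common convention, not taken from any specific project:

```python
from datetime import date

# Sketch of Slowly Changing Dimension Type 2 handling: when a tracked
# attribute changes, expire the current row and append a new versioned row.

def apply_scd2(dim_rows, natural_key, new_attrs, as_of):
    """Expire the current row for natural_key and append a new version."""
    for row in dim_rows:
        if row["natural_key"] == natural_key and row["is_current"]:
            if all(row.get(k) == v for k, v in new_attrs.items()):
                return dim_rows  # no attribute changed, nothing to do
            row["is_current"] = False
            row["end_date"] = as_of
    dim_rows.append({
        "natural_key": natural_key,
        **new_attrs,
        "effective_date": as_of,
        "end_date": None,
        "is_current": True,
    })
    return dim_rows

dim = [{"natural_key": "C1", "city": "Eden Prairie",
        "effective_date": date(2020, 1, 1), "end_date": None, "is_current": True}]
apply_scd2(dim, "C1", {"city": "Hillsboro"}, date(2021, 6, 1))
print([(r["city"], r["is_current"]) for r in dim])
# [('Eden Prairie', False), ('Hillsboro', True)]
```

Type I would instead overwrite the attribute in place, and Type III would keep the prior value in a dedicated "previous" column.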

Confidential - Manchester, NH

Sr. Data Modeler/Data Analyst

Responsibilities:

  • Developed a data mart for the base data in star and snowflake schemas; involved in developing the data warehouse for the database.
  • Worked on Unit Testing for three reports and created SQL Test Scripts for each report as required
  • Extensively used Erwin as the main tool for modeling along with Visio
  • Established and maintained comprehensive data model documentation including detailed descriptions of business entities, attributes, and data relationships.
  • Worked on the Metadata Repository (MRM) to keep definitions and mapping rules up to date.
  • Trained colleagues on the Spotfire tool and provided guidance on creating Spotfire visualizations.
  • Developed Contracting Business Process Model Workflows (current / future state) using Bizagi Process Modeler software.
  • Analyzed the business requirements by dividing them into subject areas and understood the data flow within the organization
  • Created a Data Mapping document after each assignment and wrote the transformation rules for each field as applicable
  • Configured & developed the triggers, workflows, validation rules & having hands on the deployment process from one sandbox to other.
  • Created automatic field updates via workflows and triggers to satisfy internal compliance requirement of stamping certain data on a call during submission.
  • Performed forward engineering of the data models, reverse engineering of existing data models, and updates to the data models.
  • Performed data cleaning and data manipulation activities using NZSQL utility.
  • Created and developed the stored procedures, triggers to handle complex business rules, history data and audit analysis.
  • Worked in importing and cleansing of data from various sources like Teradata, flat files, SQL Server 2005 with high volume data.
  • Analyzed the physical data model to understand the relationship between existing tables. Cleansed the unwanted tables and columns as per the requirements as part of the duty being a Data Analyst.
  • Analyzed and understood the architectural design of the project in a step by step process along with the data flow
  • Created DDL scripts for implementing Data Modeling changes.
  • Created Erwin reports in HTML and RTF formats as required, published the data model in the model mart, created naming convention files, and coordinated with DBAs to apply data model changes.

Environment: Erwin 9.5, Teradata 15, SSIS, Business Objects, SQL Server 2016, ER/Studio, Windows XP, MS Excel.

Confidential - Chicago, IL

Sr. Data Modeler

Responsibilities:

  • Working on enterprise-level modules for the insurance client, performing data modeling and data architecture tasks and providing solutions for data migration across platforms, from hierarchical to relational to unstructured databases.
  • Working on OLAP models for data warehouse and data mart development using the Ralph Kimball methodology, as well as OLTP models, and interacting with all involved stakeholders and SMEs to derive solutions.
  • Worked with Business Analyst during requirements gathering and business analysis to prepare high level Logical Data Models and Physical Data Models using E/R Studio.
  • Analyzed the reverse engineered Enterprise Originations (EO) physical data model to understand the relationships between already existing tables and cleansed unwanted tables and columns as part of Data Analysis responsibilities.
  • Created a Data Mapping document after each assignment and wrote the transformation rules for each field as applicable.
  • Created data trace map and data quality mapping documents.
  • Created Use Case Diagrams using UML to define the functional requirements of the application.
  • Provided a consultative approach with business users, asking questions to understand the business need and deriving the data flow, logical, and physical data models based on those needs.
  • Created conceptual, logical, and physical models for OLTP, data warehouse, Data Vault, and data mart (star/snowflake schema) implementations.
  • Working on facts and dimensions for the dimensional modeling data architecture as well as relational data modeling for the transactional systems like in-house applications and policy, claims applications, etc.
  • Working on policy and claims applications to model the tables according to the 3NF modeling and working on resolving many to many relationships, bridge tables, reference tables, master data, etc.
  • Led the modeling efforts on multiple enterprise-level projects.
  • Created logical and physical models and ER diagrams using Power Designer modeling tool for the relational and dimensional data modeling.
  • Working on forward and reverse engineering the DDL for the SQL Server, DB2 and Teradata environments.
  • Collected technical and business metadata and maintained naming standards by working along with the architects, data governance, business analysts and developers, SME's, etc.
  • Created PL/SQL packages and Database Triggers and developed user procedures and prepared user manuals for the new programs.
  • Interacted with solution architects on framework analysis for multiple modules such as claims and policy, ensuring timely delivery of data into the new system for transactional analysis by the .NET framework team, with output redirected to the reporting platform on Teradata.
  • Led interactions as a data modeler with the reporting team and BI specialists to analyze requirements and the data being migrated from transactional systems into the data warehouse by the ETL teams using DataStage, modeling the structures to meet the business requirements for reporting from MicroStrategy.
  • Providing solutions for the business issues and overcoming the technical difficulties by providing the optimum model to solve those issues to satisfy all the stakeholders involved.
  • Worked on multiple areas including auto, policy, claims, and life insurance.
  • Interacted with business analysts, architects, and ETL teams using DataStage to build a sophisticated modeling solution for the customers.
  • Led discussions on designing models at the enterprise level.
  • Worked on SQL Server, DB2 and Teradata platforms.

Environment: E/R Studio 9.5, SQL Server 2014, DB2, Teradata 14, ETL, OLAP, OLTP, Data Warehouse, Data Vault, Data Mart, Star/Snowflake schema

Confidential - Columbus, GA

Data Analyst/Data Modeler

Responsibilities:

  • Taking care of all day to day operations on databases which included incidents, changes and alerts.
  • Developed a Conceptual model using Erwin based on requirements analysis
  • Developed normalized Logical and Physical database models to design OLTP system for insurance applications
  • Created dimensional model for the reporting system by identifying required dimensions and facts using Erwin 9.5.
  • Used forward engineering to create a Physical Data Model with DDL that best suits the requirements from the Logical Data Model
  • Designed and Developed Use Cases, Activity Diagrams, and Sequence Diagrams using Unified Modeling Language (UML)
  • Involved in the analysis of the existing claims processing system, mapping phase according to functionality and data conversion procedure.
  • Performed Normalization of the existing OLTP systems (3rd NF), to speed up the DML statements execution time.
  • Data modeling in Erwin; design of target data models for enterprise data warehouse (Teradata)
  • Created and Maintained Logical Data Model (LDM) for the project. Includes documentation of all Entities, Attributes, Data Relationships, Primary and Foreign Key Structures, Allowed Values, Codes, Business Rules, Glossary Terms, etc.
  • Developed the required data warehouse model using Star schema for the generalized model
  • Experienced in Oracle installations, upgrades, migration, designing logical/physical architecture, Tuning, Capacity planning, database access and Security and auditing.
  • Applied strong conceptual and logical data modeling skills; conducted JAD sessions for requirements gathering, created data mapping documents, and wrote functional specifications.
  • Database backup, recovery & cloning using RMAN methodology.
  • Knowledge of OLAP, dimensional data modeling, Operational Data Store (ODS), and snowflake modeling for fact and dimension tables using Analysis Services.
  • Identified performance problems using wait event statistics, monitoring session-level wait events and collecting historical data for root cause analysis. Supported the offshore development team with issue fixing, code migrations, and other support activities.
  • Collaborated with ETL, BI and DBA teams to analyze and provide solutions to data issues and other challenges while implementing the OLAP model.
  • Developed the Logical and physical data model and designed the data flow from source systems to Teradata tables and then to the Target system.
  • Worked for cleansing and organizing various tables in a presentable manner to help with better understanding of already existing models.
  • Generated various presentable reports and documentation using report designer and pinned reports in ERWIN.
  • Knowledge of ITIL processes (Problem, Change, Incident, and Configuration Management), having been part of a number of pre-prod and prod deployments.
  • Practical understanding of the Data modeling (Dimensional & Relational) concepts like Star-Schema Modeling, Snowflake Schema Modeling, Fact and Dimension tables.
  • Identified entities and attributes and developed conceptual, logical and physical models using ERWIN.

Environment: Erwin 8.0, OLTP, DDL, UML, Star schema, Oracle 11g, ETL, Star-Schema Modeling, Snowflake Schema Modeling, Fact and Dimension tables.
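The normalization of OLTP structures to 3NF mentioned in this role can be sketched by removing a transitive dependency. The denormalized table and its decomposition below are invented for illustration (SQLite keeps the sketch self-contained):

```python
import sqlite3

# Sketch of normalizing a table with a transitive dependency
# (order_id -> cust_id -> cust_city) into 3NF. Schemas are illustrative.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE orders_denorm (order_id INTEGER, cust_id INTEGER, cust_city TEXT);
INSERT INTO orders_denorm VALUES
    (10, 1, 'Plano'), (11, 1, 'Plano'), (12, 2, 'Chicago');

-- 3NF decomposition: cust_city depends only on cust_id, so it moves out.
CREATE TABLE customers AS SELECT DISTINCT cust_id, cust_city FROM orders_denorm;
CREATE TABLE orders AS SELECT order_id, cust_id FROM orders_denorm;
""")

# The join over the decomposed tables reproduces the original rows,
# so the decomposition is lossless.
cur.execute("""
    SELECT o.order_id, o.cust_id, c.cust_city
    FROM orders o JOIN customers c ON o.cust_id = c.cust_id
    ORDER BY o.order_id
""")
print(cur.fetchall())
# [(10, 1, 'Plano'), (11, 1, 'Plano'), (12, 2, 'Chicago')]
```

After the split, a customer's city is stored once, so an update touches one row instead of every order, which is the DML speed-up the bullet above refers to.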

Confidential - Plano, TX

Data Analyst

Responsibilities:

  • Worked with Business users during requirements gathering and prepared Conceptual, Logical and Physical Data Models.
  • Wrote PL/SQL statements, stored procedures, and triggers in DB2 for extracting and writing data.
  • Optimized existing procedures and SQL statements for better performance, using EXPLAIN PLAN, hints, and SQL TRACE to tune SQL queries.
  • Developed interfaces able to connect to multiple databases such as SQL Server and Oracle.
  • Assisted Kronos project team in SQL Server Reporting Services installation.
  • Developed SQL Server database to replace existing Access databases.
  • Attended and participated in information and requirements gathering sessions
  • Translated business requirements into working logical and physical data models for Data warehouse, Data marts and OLAP applications.
  • Designed and created web applications to receive query string input from customers and facilitate entering the data into SQL Server databases.
  • Performed thorough data analysis for the purpose of overhauling the database using SQL Server.
  • Designed and implemented business intelligence to support sales and operations functions to increase customer satisfaction.
  • Converted physical database models from logical models, to build/generate DDL scripts.
  • Maintained warehouse metadata, naming standards and warehouse standards for future application development.
  • Extensively used ETL to load data from DB2, Oracle databases.
  • Involved with data profiling for multiple sources and answered complex business questions by providing data to business users.
  • Worked with data investigation, discovery and mapping tools to scan every single data record from many sources.
  • Designed both 3NF data models for ODS and OLTP systems and dimensional data models using star and snowflake schemas.
  • Wrote and executed unit, system, integration, and UAT scripts for data warehouse projects.
  • Extensively used ETL methodology for supporting data extraction, transformations and loading processing, in a complex EDW using Informatica.
  • Worked with star schemas, DB2, and IMS DB.

Environment: Oracle, PL/SQL, DB2, Erwin 7.0, UNIX, Teradata SQL Assistant, Informatica, OLTP, OLAP, Data Marts, DQ Analyzer.
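The query tuning with EXPLAIN PLAN described in this role follows one recurring check: does the query hit an index or scan the whole table? A rough, self-contained analogue uses SQLite's EXPLAIN QUERY PLAN rather than Oracle's EXPLAIN PLAN; the table and index names are hypothetical:

```python
import sqlite3

# Rough analogue of reading an execution plan to confirm index usage.
# Oracle's EXPLAIN PLAN is what the text refers to; SQLite's EXPLAIN QUERY
# PLAN is used here only to keep the sketch runnable. Names are invented.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE claims (claim_id INTEGER PRIMARY KEY, member_id INTEGER, amount REAL);
CREATE INDEX idx_claims_member ON claims (member_id);
""")

cur.execute("EXPLAIN QUERY PLAN SELECT * FROM claims WHERE member_id = 42")
for row in cur.fetchall():
    # The detail column should name idx_claims_member, confirming an
    # index search rather than a full table scan.
    print(row)
```

In Oracle the equivalent workflow is `EXPLAIN PLAN FOR <query>` followed by querying the plan table, then reading whether the access path is an index range scan or a full scan.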
