Data Modeler Resume

Columbia, MD

SUMMARY:

  • 15+ years of extensive experience in the design, development, and implementation of data warehouses as a Data Modeler. Strong experience across the entire Data Warehouse/Data Mart life cycle using data modeling practices. Strong knowledge of data warehousing concepts (Ralph Kimball and Bill Inmon approaches) and dimensional modeling (star schema and snowflake schema).
  • 8+ years of strong data modeling experience using dimensional data modeling, star/snowflake schemas, fact and dimension tables, and physical, logical, and conceptual data models.
  • Extensive experience with ER (Entity Relationship) data modeling, mostly on live transactional systems, with the ability to convert complex business ideas into data models.
  • Proven experience in data analysis, data modeling, data management, documentation, and presentation.
  • Extensive experience as an ETL Developer on Informatica PowerCenter/PowerMart 7.1/7.3/8.1/9.0/9.5/9.6, and experience as an ETL Designer.
  • Developed in databases such as Oracle 8.x/9.x/10g, MySQL, PostgreSQL 8.x/9.x, Sybase 11.x/12.x (ASE, IQ), MS SQL Server 2000/2005/2008, MS Access, Teradata, and Netezza, with SQL*Plus and SQL*Loader, using Toad, SQL Developer, DBArtisan, Rapid SQL, and pgAdmin III.
  • Good experience in back-end programming using PL/SQL: triggers, stored procedures, database table design, indexes, and performance tuning in Oracle.
  • Proficient in coding UNIX shell scripts.
  • Performance tuning of Informatica mappings and sessions and of Oracle PL/SQL blocks.
  • Hands-on experience with Business Objects 5.0.1 and Power Analyzer 4.0.
  • Used application packages such as Visual SourceSafe 6.0 and MS Office 2000.
  • Strong documentation, communication, problem-solving, and analytical skills.
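The dimensional modeling concepts above (fact and dimension tables in a star schema) can be illustrated with a minimal sketch. All table and column names here are hypothetical, invented for illustration only, not taken from any engagement described in this resume:

```python
import sqlite3

# Minimal star schema: one fact table referencing two dimension tables.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE dim_date (
    date_key    INTEGER PRIMARY KEY,   -- surrogate key, e.g. 20240115
    full_date   TEXT,
    year        INTEGER,
    month       INTEGER
);
CREATE TABLE dim_product (
    product_key INTEGER PRIMARY KEY,   -- surrogate key
    product_id  TEXT,                  -- natural key from the source system
    name        TEXT
);
CREATE TABLE fact_sales (
    date_key    INTEGER REFERENCES dim_date(date_key),
    product_key INTEGER REFERENCES dim_product(product_key),
    quantity    INTEGER,
    amount      REAL
);
""")

# Load one row per table, then query across the star.
cur.execute("INSERT INTO dim_date VALUES (20240115, '2024-01-15', 2024, 1)")
cur.execute("INSERT INTO dim_product VALUES (1, 'P-100', 'Widget')")
cur.execute("INSERT INTO fact_sales VALUES (20240115, 1, 3, 29.97)")
row = cur.execute("""
    SELECT p.name, d.year, SUM(f.amount)
    FROM fact_sales f
    JOIN dim_date d    ON f.date_key = d.date_key
    JOIN dim_product p ON f.product_key = p.product_key
    GROUP BY p.name, d.year
""").fetchone()
print(row)  # ('Widget', 2024, 29.97)
```

The fact table holds measures keyed by surrogate keys into the dimensions; reporting queries join outward from the fact, which is the defining shape of a star schema.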

TECHNICAL SKILLS:

ETL: Informatica 10.0/9.0/8.6/8.1/7.1/6.1/5.1

Data Modeling: Power Designer 16.x, Erwin 9.x, ER Studio 9.0

DBMS/RDBMS: Oracle 8i/9i/10g, Teradata 5.1, SQL Server 2008, MySQL, PostgreSQL 9.0, Sybase ASE/IQ, Netezza TwinFin 6.0

Programming Language: SQL, PL/SQL, Python 2.x

BI/Reporting Tool: Business Objects 5.1, Power Analyzer, Tableau, Power BI

Operating System: Windows Family, Unix, Linux

PROFESSIONAL EXPERIENCE:

Confidential, Columbia, MD

Data Modeler

Responsibilities:

  • Data modeling of the Salesforce CRM system to create and enhance the physical and logical data model design using Power Designer 15.x and now 16.x
  • Use forward and reverse engineering methods to compare and enhance the star schema data models.
  • Perform proof of concept analysis to assess the feasibility of projects.
  • Create Requirement Documents as per Business specifications.
  • Analyze data inaccuracies within databases and recommend process improvements or system changes to enhance overall quality of the data.
  • Perform various data cleansing tasks, addressing missing records or attributes, redundant records, missing keys or other required data, erroneous records, and inaccurate data.
  • Perform Data Profiling on various source systems to ensure the consistency of data.
  • Develop, document, enhance and enforce data quality standards.
  • Create Data Dictionaries; perform metadata management for various Data Marts.
  • Worked extensively in Netezza, Oracle, and SQL*Plus: tuned query performance, created DDL scripts, and created database objects such as tables, indexes, synonyms, sequences, and stored procedures. Worked closely with DBAs to improve SQL performance and was involved in data import/export.
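Data profiling of a source system, as mentioned above, typically means computing row counts, null counts, and distinct counts per column to check consistency. A minimal sketch, using a hypothetical table and data invented for illustration:

```python
import sqlite3

# Hypothetical source table with some quality problems baked in.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE src_customer (id INTEGER, email TEXT, state TEXT)")
cur.executemany("INSERT INTO src_customer VALUES (?, ?, ?)", [
    (1, "a@x.com", "MD"),
    (2, None,      "MD"),    # missing attribute
    (3, "c@x.com", None),
    (3, "c@x.com", None),    # redundant record
])

def profile(cursor, table, column):
    """Return (row_count, null_count, distinct_count) for one column."""
    total, nulls, distinct = cursor.execute(
        f"SELECT COUNT(*), "
        f"       SUM(CASE WHEN {column} IS NULL THEN 1 ELSE 0 END), "
        f"       COUNT(DISTINCT {column}) "
        f"FROM {table}"
    ).fetchone()
    return total, nulls, distinct

print(profile(cur, "src_customer", "email"))  # (4, 1, 2)
print(profile(cur, "src_customer", "state"))  # (4, 2, 1)
```

Comparing these per-column counts across source systems (and against expectations) is what surfaces the missing keys, redundant records, and inaccurate data that the cleansing steps then address.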

Confidential, TN

Data Modeler and Data Integration Lead

Responsibilities:

  • As Data Modeler, my primary role was to oversee the design of all data models in the end-to-end delivery process. The tool of choice was ER/Studio Data Architect 9.0.
  • Created conceptual, logical, and physical models for various applications and the MAM Data Warehouse. The physical models spanned several databases, including Oracle, MySQL, and PostgreSQL.
  • Use forward and reverse engineering methods to compare and enhance the star schema data models.
  • Perform Data Profiling on various source systems to ensure the consistency of data.
  • Worked extensively on loading dimensions and facts; handled change data capture for dimensions.
  • Create Data Dictionaries; perform metadata management for various Data Marts.
  • Created Wiki Documents for Requirements and maintained Data Models on Scripps Wiki pages.
  • Created ETL mappings consuming web service, JSON, and complex XML feeds.

Confidential, NE

Data Modeler and Data Warehouse Architect

Responsibilities:

  • Data modeling of the Salesforce CRM system to create and enhance the physical and logical data model design using Power Designer 15.x and now 16.x
  • Use forward and reverse engineering methods to compare and enhance the star schema data models.
  • Perform proof of concept analysis to assess the feasibility of projects.
  • Create Requirement Documents as per Business specifications.
  • Analyze data inaccuracies within databases and recommend process improvements or system changes to enhance overall quality of the data.
  • Perform various data cleansing tasks, addressing missing records or attributes, redundant records, missing keys or other required data, erroneous records, and inaccurate data.
  • Perform Data Profiling on various source systems to ensure the consistency of data.
  • Develop, document, enhance and enforce data quality standards.
  • Create Data Dictionaries; perform metadata management for various Data Marts.
  • Developed ETL Designs to Load the Salesforce Inbound and Outbound (EDW) Processes.
  • Worked extensively on loading dimensions and facts; handled change data capture for dimensions.
  • Created mappings and mapplets with reusable transformations such as Normalizer, Lookup, Filter, Expression, Stored Procedure, Aggregator, and Update Strategy.
  • Used parameter files extensively while loading data, and used external loaders such as SQL*Loader through Informatica's external loader option.
  • Designed ETL processes in Informatica 9 to load the ODS and DWH (star schema) using standard Kimball methodology.
  • Created utility programs for file manipulations and pre-/post-session tasks using shell scripts, and used Autosys for scheduling workflows.
  • Worked extensively in Netezza, Oracle, and SQL*Plus: tuned query performance, created DDL scripts, and created database objects such as tables, indexes, synonyms, sequences, and stored procedures. Worked closely with DBAs to improve SQL performance and was involved in data import/export.
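Change data capture for dimensions, mentioned above, is commonly handled in Kimball-style warehouses as a Type 2 slowly changing dimension: expire the current row and insert a new version. A minimal sketch under hypothetical names (the resume does not specify which SCD type was used):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("""
CREATE TABLE dim_client (
    client_key  INTEGER PRIMARY KEY AUTOINCREMENT,  -- surrogate key
    client_id   TEXT,      -- natural key from the source system
    city        TEXT,      -- attribute whose history we track
    eff_date    TEXT,
    end_date    TEXT,      -- NULL means this is the current version
    is_current  INTEGER
)""")
cur.execute("INSERT INTO dim_client (client_id, city, eff_date, end_date, is_current) "
            "VALUES ('C1', 'Miami', '2020-01-01', NULL, 1)")

def scd2_update(cursor, client_id, new_city, load_date):
    """Type 2 change: close out the current row, insert the new version."""
    current = cursor.execute(
        "SELECT client_key, city FROM dim_client "
        "WHERE client_id = ? AND is_current = 1", (client_id,)).fetchone()
    if current and current[1] != new_city:
        cursor.execute("UPDATE dim_client SET end_date = ?, is_current = 0 "
                       "WHERE client_key = ?", (load_date, current[0]))
        cursor.execute("INSERT INTO dim_client "
                       "(client_id, city, eff_date, end_date, is_current) "
                       "VALUES (?, ?, ?, NULL, 1)", (client_id, new_city, load_date))

scd2_update(cur, "C1", "Columbia", "2021-06-01")
rows = cur.execute("SELECT city, is_current FROM dim_client "
                   "ORDER BY client_key").fetchall()
print(rows)  # [('Miami', 0), ('Columbia', 1)]
```

The old row is retained with its validity window closed, so facts loaded before the change still join to the historically correct dimension version.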

Confidential, Miami, FL

Data Modeler and Data Warehouse Architect

Responsibilities:

  • Data modeling of the Data Warehouse system to create and enhance the physical and logical model design using Power Designer 12.x
  • Use forward and reverse engineering methods to compare and enhance the star schema data models.
  • Involved in collecting user requirements from various source systems such as Launchpad, CCL.com, and mainframe drains, and then integrating the data using a daily snapshot date.
  • Data is processed through standard DWH methodology, loaded into the ODS and then into the RMS data warehouse.
  • Developed and maintained ETL (data extraction, transformation, and loading) mappings using Informatica Designer 8.1.1 and Oracle PL/SQL to extract data from multiple source systems, including Oracle 8i/9i/10g, Sybase 11.x, SQL Server 7.2, MS Access, and flat files, and load multiple targets such as Oracle and flat files.
  • Developed ETL on Informatica 8.1.1.
  • Worked extensively on loading dimensions and facts; handled change data capture for dimensions.
  • Created mappings and mapplets with reusable transformations such as Normalizer, Lookup, Filter, Expression, Stored Procedure, Aggregator, and Update Strategy.
  • Used parameter files extensively while loading data, and used external loaders such as SQL*Loader through Informatica's external loader option.
  • Worked extensively in Oracle PL/SQL and SQL*Plus: tuned query performance, created DDL scripts, and created database objects such as tables, indexes, and stored procedures.
  • Created utility programs for file manipulations and pre-/post-session tasks using shell scripts, and used Crontab/Autosys/Opcon for scheduling workflows.
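The pre-session utility programs described above typically stage or archive incoming flat files before a workflow runs. The originals were shell scripts, but the idea can be sketched in Python; every path, file name, and function name here is hypothetical:

```python
import os
import shutil
import tempfile

def presession_stage(incoming_dir, staged_dir, suffix=".dat"):
    """Move incoming data files into a staging area before the session runs,
    returning the file names so the workflow log can record what was picked up."""
    os.makedirs(staged_dir, exist_ok=True)
    moved = []
    for name in sorted(os.listdir(incoming_dir)):
        if name.endswith(suffix):          # only data files, not stray files
            shutil.move(os.path.join(incoming_dir, name),
                        os.path.join(staged_dir, name))
            moved.append(name)
    return moved

# Usage with throwaway directories standing in for the real landing zones.
inc = tempfile.mkdtemp()
stg = tempfile.mkdtemp()
open(os.path.join(inc, "sales_20240115.dat"), "w").close()
open(os.path.join(inc, "readme.txt"), "w").close()
print(presession_stage(inc, stg))  # ['sales_20240115.dat']
```

A scheduler entry (crontab, Autosys, or Opcon, as listed above) would invoke such a script ahead of the Informatica session so the workflow always reads from a stable staged location.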

Confidential, NJ

Data Modeler/Data Integration Lead

Responsibilities:

  • Developed logical and physical data models using Erwin Data modeler 7.x.
  • Used forward and reverse engineering methods to compare and enhance the star schema data models.
  • Perform proof of concept analysis to assess the feasibility of projects.
  • Create Requirement Documents as per Business specifications.
  • Analyze data inaccuracies within databases and recommend process improvements or system changes to enhance overall quality of the data.
  • Perform various data cleansing tasks, addressing missing records or attributes, redundant records, missing keys or other required data, erroneous records, and inaccurate data.
  • Involved in collecting user requirements from various source systems like GL, BEN, PATHFINDER, etc. and designing the Data Warehouse (logical and physical models), designing and developing dimensional data models (star schema).
  • Developed and maintained ETL (data extraction, transformation, and loading) mappings using Informatica Designer 7.3 to extract data from multiple source systems, including Oracle 8i/9i, Sybase 11.x, SQL Server 7.2, MS Access, and flat files, and load multiple targets such as Oracle and flat files.
  • Worked extensively on loading dimensions and facts; handled change data capture for dimensions such as Client.
  • Created mappings and mapplets with reusable transformations such as Normalizer, Lookup, Filter, Expression, Stored Procedure, Aggregator, and Update Strategy.
  • Performed production support to the existing mappings.
  • Used parameter files extensively while loading data, and used external loaders such as SQL*Loader through Informatica's external loader option.
  • Participated in the code walkthroughs and mentoring the team members.
  • Participated in providing the Project Estimates for development team efforts for offshore as well as on-site.
  • Worked extensively in Oracle and SQL*Plus: tuned query performance, created DDL scripts, and created database objects such as tables, indexes, synonyms, sequences, and stored procedures. Worked closely with DBAs to improve SQL performance and was involved in data import/export.
  • Created utility programs for file manipulations and pre-/post-session tasks using shell scripts, and used Crontab/Autosys for scheduling workflows.
  • Involved in ETL testing; created unit and integration test plans to test the mappings, and created the test data.

Confidential

ETL Developer

Responsibilities:

  • Designed the Data Mart (star schema, logical and physical models), and supported the UAT and production phases.
  • Created the Transformation Rules, Mapping Specifications and Workflow Specifications.
  • Involved in doing the Impact Analysis of the changes done to the existing mappings and providing the estimations.
  • Created the design specs for loading the data from OLTP to STG Schema, STG schema to DWH Schema.
  • Used ERWIN to design the Logical/Physical Data Models, forward/reverse engineering
  • Developed and maintained ETL mappings using Informatica Designer 7.3 to extract data from multiple source systems comprising Oracle 8i/9i, Sybase 11.x, SQL Server 2000, MS Access, and flat files, and load multiple targets such as Oracle and flat files.
  • Migrated mappings from Development Repository to Test Repository and also from Test Repository to Production Repository.
  • Created extensive mappings and mapplets with reusable transformations such as Normalizer, Lookup, Filter, Expression, Stored Procedure, Aggregator, and Update Strategy, along with worklets.
  • Extensively worked on tuning and improving the Performance of Mappings.
  • Created utility programs for file manipulations and pre-/post-session tasks using shell scripts and AWK, and used crontab for scheduling workflows.
  • Performed ETL testing, created Unit test plan and Integration test plan to test the mappings and created the test data.
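Unit testing a mapping, as described above, amounts to running its transformation rules against hand-built test data and comparing results to expectations. A sketch with a hypothetical transformation rule, invented here purely to show the shape of such a test:

```python
import re

def standardize_phone(raw):
    """Hypothetical transformation rule: keep digits, format as NNN-NNN-NNNN;
    values that cannot be standardized return None (routed to a reject target)."""
    digits = re.sub(r"\D", "", raw or "")
    if len(digits) != 10:
        return None
    return f"{digits[:3]}-{digits[3:6]}-{digits[6:]}"

# Test data and expected outcomes, as laid out in a unit test plan.
cases = {
    "(410) 555-1234": "410-555-1234",
    "410.555.1234":   "410-555-1234",
    "555-1234":       None,            # too short: rejected
}
for raw, expected in cases.items():
    assert standardize_phone(raw) == expected
print("all cases passed")
```

The same case table then doubles as integration test data: after the mapping runs, target rows are compared against these expected values end to end.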

Resume of Supragya Mishra