
Data Modeler Resume

Dallas, TX

SUMMARY

  • An Information Technology professional with more than 3 years of experience in all phases of the software development life cycle, including system analysis, design, data modeling, implementation, and support of OLTP, data warehousing, and OLAP applications.
  • Data Modeler with strong conceptual and logical data modeling skills; experienced with JAD sessions for requirements gathering, creating data mapping documents, and writing functional specifications and queries.
  • Extensive experience in relational and dimensional data modeling for creating logical and physical database designs and ER diagrams using data modeling tools such as ERwin and ER Studio.
  • Worked extensively on forward and reverse engineering processes. Created DDL scripts to implement data model changes, generated ERwin reports in HTML and RTF formats as required, published data models to the model mart, created naming convention files, and coordinated with DBAs to apply data model changes.
  • Extensive experience in writing functional specifications and translating business requirements into technical specifications; created, maintained, and modified database design documents with detailed descriptions of logical entities and physical tables.
  • Excellent knowledge of the waterfall and spiral methodologies of the Software Development Life Cycle (SDLC).
  • Strong documentation and knowledge-sharing skills; conducted data modeling review sessions for different user groups and participated in requirements sessions to assess requirement feasibility.
  • Extensive experience working with business users/SMEs as well as senior management.
  • Strong understanding of data warehousing principles, fact tables, dimension tables, and star and snowflake schema modeling.
  • Experience in backend programming, including schema and table design, stored procedures, triggers, views, and indexes.
  • Excellent analytical, interpersonal, and communication skills with a strong technical background.

TECHNICAL EXPOSURE

Data Modeling

Dimensional Data Modeling, Relational Data Modeling, Star and Snowflake Schema, Fact and Dimension Tables, Conceptual, Logical and Physical Data Modeling, ER Studio 7.1.1, ERwin 4.0/3.5.2/3.x, Power Designer.

Data Warehousing

Informatica PowerCenter 8.1/8.0/7.1/7.0/6.2/6.1/5.2, Informatica PowerMart 4.7, PowerConnect, Power Exchange, Data Profiling, Data cleansing, OLAP, OLTP, SQL*Plus.

Databases

Oracle 10g/9i/8i/8.0/7.0, DB2 8.0/7.0, SQL Server 12.0/11.x, MS SQL 7.0, SQL Server 2000/2005, MS Access 7.0/97/2000, Quest Central DB2.

Environment

UNIX, Linux, Windows 95/98/2000/XP.

Programming Languages

Matlab, C, C++

PROFESSIONAL EXPERIENCE

Confidential, Dallas, TX May 2009 – Present
Data Modeler/On-site Project Coordinator

Responsibilities:

  • Participated in requirements-gathering sessions with business users and sponsors to understand and document the business requirements and the goals of the project.
  • Created and reviewed the conceptual model for the EDW (Enterprise Data Warehouse) with business users.
  • Analyzed the source system (JD Edwards) to understand the source data and JDE table structure along with deeper understanding of business rules and data integration checks.
  • Identified various facts and dimensions from the source system and business requirements to be used for the data warehouse.
  • Created the dimensional logical model with approximately 10 facts, 30 dimensions with 500 attributes using ER Studio.
  • Implemented the Slowly changing dimension scheme (Type II) for most of the dimensions.
  • Implemented the standard naming conventions for the fact and dimension entities and attributes of logical and physical model.
  • Reviewed the logical model with business users, the ETL team, DBAs, and the testing team to provide information about the data model and business requirements.
  • Created the DDL scripts using ER Studio and source to target mappings (S2T- for ETL) to bring the data from JDE to the warehouse.
  • Worked with the DBA to create the physical model and tables. Scheduled multiple brainstorming sessions with DBAs and the production support team to discuss views, partitioning, and indexing schemes case by case for the facts and dimensions.
  • Worked on the model based volumetric analysis and data based volumetric analysis to provide accurate space requirements to the production support team.
  • Worked in Mercury Quality Center to track defects logged against the logical and physical models.
  • Worked as an onsite project coordinator once the design of the database was finalized in order to implement the data warehouse according to the implementation standards.
  • Worked with the client and the offshore team to make sure the reports and dashboards were delivered on time.
  • Participated in UAT sessions to educate the business users about the reports, dashboards and the BI System. Worked with the test team to provide insights into the data scenarios and test cases.
  • Worked on data profiling and data validation to ensure the accuracy of the data between the warehouse and source systems.
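The Type II slowly changing dimension handling described above keeps history by closing the current row and inserting a new version. A minimal sketch of the pattern in Python with SQLite (the `dim_customer` table and column names are hypothetical, not the actual EDW schema):

```python
import sqlite3

# Hypothetical customer dimension with SCD Type II housekeeping columns.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE dim_customer (
        customer_sk   INTEGER PRIMARY KEY,   -- surrogate key
        customer_id   TEXT,                  -- natural (source) key
        city          TEXT,
        effective_dt  TEXT,
        expiry_dt     TEXT,                  -- '9999-12-31' marks the current row
        is_current    INTEGER
    )""")

def scd2_upsert(conn, customer_id, city, load_dt):
    """Close the current row if the tracked attribute changed, then insert a new version."""
    row = conn.execute(
        "SELECT customer_sk, city FROM dim_customer "
        "WHERE customer_id = ? AND is_current = 1", (customer_id,)).fetchone()
    if row and row[1] == city:
        return  # no change, keep the current version
    if row:
        conn.execute(
            "UPDATE dim_customer SET expiry_dt = ?, is_current = 0 "
            "WHERE customer_sk = ?", (load_dt, row[0]))
    conn.execute(
        "INSERT INTO dim_customer (customer_id, city, effective_dt, expiry_dt, is_current) "
        "VALUES (?, ?, ?, '9999-12-31', 1)", (customer_id, city, load_dt))

scd2_upsert(conn, "C001", "Dallas", "2009-05-01")
scd2_upsert(conn, "C001", "Plano", "2009-08-15")   # attribute change -> new version

history = conn.execute(
    "SELECT city, effective_dt, expiry_dt, is_current "
    "FROM dim_customer WHERE customer_id = 'C001' ORDER BY customer_sk").fetchall()
print(history)
# Two rows: the closed Dallas version and the current Plano version.
```

In the real warehouse the same close-and-insert step would run inside the ETL load, keyed on the natural key from the source system.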

Environment – ER Studio 8.0.3, Microsoft SQL Server 2008, SQL Server Management Studio, SQL Server 2008 Integration Services, SQL Server 2008 Reporting Services, SQL Server 2008 Analysis Services, Mercury Quality Center 9.

Confidential, Minneapolis, MN Nov 2007 – April 2009
Data Modeler/Data Analyst

Responsibilities:

  • Participated in JAD sessions with business users and sponsors to understand and document the business requirements in alignment with the financial goals of the company.
  • Created the conceptual model for the data warehouse, with emphasis on insurance (life and health), mutual funds, and annuities, using the Embarcadero ER Studio data modeling tool.
  • Reviewed the conceptual EDW (Enterprise Data Warehouse) data model with business users, App Dev, and information architects to make sure all the requirements were fully covered.
  • Analyzed a large number of COBOL copybooks from 16 mainframe sources to understand existing constraints, relationships, and business rules in the legacy data.
  • Worked on rationalizing the requirements across multiple product lines.
  • Reviewed and implemented the naming standards for the entities, attributes, alternate keys, and primary keys for the logical model.
  • Created the logical model for the EDW with approximately 75 entities and 1,000 attributes using ER Studio. The logical model was fully attributed to third normal form (3NF) and contains both current and history tables; it is divided into a number of submodels for ease of understanding and comprehension.
  • Reviewed the logical model with application developers, ETL Team, DBAs and testing team to provide information about the data model and business requirements.
  • Worked with ETL to create source to target mappings (S2T).
  • Worked with DBA to create the physical model and tables.
  • Worked in Mercury Quality Center to track defects logged against the logical and physical models.
  • Held brainstorming sessions with application developers and DBAs to discuss denormalization, partitioning, and indexing schemes for the physical model.
  • Worked on a Requirements Traceability Matrix to trace business requirements back to the logical model.
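Full attribution to third normal form, as described above, means every non-key attribute depends on the key alone; transitive dependencies are split into their own entities. A toy illustration of the decomposition (hypothetical policy/agent data, not the actual model content):

```python
# A flattened source record repeats agent details on every policy row:
# agent_name depends on agent_id, not on the policy key (a transitive dependency).
policies_flat = [
    {"policy_no": "P1", "agent_id": "A1", "agent_name": "Smith", "premium": 500},
    {"policy_no": "P2", "agent_id": "A1", "agent_name": "Smith", "premium": 700},
    {"policy_no": "P3", "agent_id": "A2", "agent_name": "Jones", "premium": 300},
]

# 3NF decomposition: agent attributes move to their own entity keyed by agent_id;
# the policy entity keeps only the foreign key.
agents = {}
policies = []
for row in policies_flat:
    agents[row["agent_id"]] = {"agent_name": row["agent_name"]}
    policies.append({"policy_no": row["policy_no"],
                     "agent_id": row["agent_id"],
                     "premium": row["premium"]})

print(agents)        # {'A1': {'agent_name': 'Smith'}, 'A2': {'agent_name': 'Jones'}}
print(len(policies)) # 3
```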

Environment – ER Studio 7.1.1, Quest Central for DB2 v4.8, COBOL copybooks, Mainframe DB2, Mercury Quality Center 9, Informatica PowerCenter 8.1

Confidential, Shawnee, KS June 2007 – Nov 2007
Data Modeler/Data Analyst

Responsibilities:

  • Analyzed existing logical data model (LDM) and made appropriate changes to make it compatible with business requirements.
  • Expanded Physical Data Model (PDM) for the OLTP application using ERwin.
  • Applied the IDEF1X and IE data modeling methodologies.
  • Involved in data model reviews with the internal data architect, business analysts, and business users, explaining the data model to make sure it was in line with business requirements.
  • Created rationalized domains to bring consistency in the tables.
  • Identified source systems, their connectivity, and related tables and fields, and ensured data suitability for mapping.
  • Worked with cross-functional teams and prepared detailed design documents for production phase of current customer database application.

Environment: ERwin 4.0, ERwin 3.5.2, Toad, PL/SQL, Oracle 9i, SQL Server 2000, SQL*Loader, UNIX, Windows 2005

Confidential June 2004 – Nov 2004
Data warehouse design

  • Received training on SQL Server 2000. Created a number of tables, functions, triggers, stored procedures, and views in SQL Server.
  • Participated in business requirement gathering sessions.
  • Worked on identifying facts, dimensions and various other concepts of dimensional modeling which are used for data warehousing projects.
  • Trained in Inmon and Kimball approaches for data warehouse design.
  • Worked on Normalization and Denormalization techniques.
  • Trained on building Conceptual, Logical and Physical data model.
  • Defined relationships and cardinalities among entities.
  • Developed queries using PL/SQL and many stored procedures to do the validations.
  • Created and maintained database objects (tables, views, indexes, partitions, database triggers, etc.).
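Database triggers of the kind listed above can enforce validation rules declaratively, rejecting rows that fail a business check. A minimal sketch using SQLite from Python (hypothetical `orders` table; the training above used SQL Server 2000, whose trigger syntax differs):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (order_id INTEGER PRIMARY KEY, qty INTEGER)")

# A BEFORE INSERT trigger enforcing a simple business rule: qty must be positive.
conn.execute("""
    CREATE TRIGGER trg_orders_qty
    BEFORE INSERT ON orders
    WHEN NEW.qty <= 0
    BEGIN
        SELECT RAISE(ABORT, 'qty must be positive');
    END""")

conn.execute("INSERT INTO orders (qty) VALUES (5)")      # passes the check
try:
    conn.execute("INSERT INTO orders (qty) VALUES (0)")  # rejected by the trigger
except sqlite3.IntegrityError as e:
    rejected = str(e)

count = conn.execute("SELECT COUNT(*) FROM orders").fetchone()[0]
print(count, rejected)  # 1 qty must be positive
```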

EDUCATION

M.S. (Electrical and Computer Engineering)

Research Experience

  • Worked in a team to design the database for image analysis.
  • Worked on a SQL–Matlab interface to bring image data directly into the Matlab application for analysis.
  • Performed data analysis and feature classification techniques for real time airborne imagery.
  • Developed a front-end application in Matlab to facilitate analysis of airborne imagery.
  • Developed an application to register the aerial images.
  • Analyzed the images for minefield detection using RX anomaly algorithms.
  • Completed error analysis (Mean Error, Variance of the error) on the results from various flight simulation exercises.
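The error analysis above reduces to first- and second-moment statistics over the residuals from each run. A small sketch with made-up numbers (the actual flight simulation data is not reproduced here):

```python
import statistics

# Hypothetical residuals (estimate - truth) from several flight simulation runs.
errors = [0.4, -0.1, 0.3, 0.0, -0.2, 0.2]

mean_error = statistics.mean(errors)      # bias of the estimator
error_var = statistics.pvariance(errors)  # spread of the error around its mean
print(round(mean_error, 3), round(error_var, 3))  # 0.1 0.047
```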

Environment: Matlab, C++, MS SQL, DB2.

Projects

  • Developed an application in Matlab for evaluating various Adaptive filter algorithms for different communication channels for both sparse and dispersive echo paths.
  • Developed a feed forward back propagation neural network for zip code recognition.
  • Implemented Goertzel algorithm for DTMF detection on DSP kit 6713.
  • Developed Matlab application for the registration of the images with both rotation and translation using Fast Fourier Transform.
  • Developed a Matlab application for acoustic communication involving transmission of binary data in presence of noise.
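The Goertzel algorithm used for DTMF detection above evaluates the signal energy at a single target frequency with one recurrence per sample, avoiding a full FFT. A Python sketch of the idea (the original ran in C on the DSP kit 6713; the sample rate and tone length here are illustrative):

```python
import math

def goertzel_power(samples, sample_rate, target_freq):
    """Squared magnitude of the DFT bin nearest target_freq, via the Goertzel recurrence."""
    n = len(samples)
    k = round(n * target_freq / sample_rate)   # nearest bin index
    coeff = 2.0 * math.cos(2.0 * math.pi * k / n)
    s_prev, s_prev2 = 0.0, 0.0
    for x in samples:
        s = x + coeff * s_prev - s_prev2
        s_prev2, s_prev = s_prev, s
    return s_prev**2 + s_prev2**2 - coeff * s_prev * s_prev2

# The DTMF '1' key mixes 697 Hz and 1209 Hz; sample at 8 kHz and probe tone energies.
fs, n = 8000, 400
tone = [math.sin(2*math.pi*697*i/fs) + math.sin(2*math.pi*1209*i/fs) for i in range(n)]

p_697 = goertzel_power(tone, fs, 697)
p_941 = goertzel_power(tone, fs, 941)   # a DTMF row frequency NOT present in the tone
print(p_697 > 10 * p_941)  # True: energy concentrates at the frequencies actually present
```

A detector would compare the powers at all eight DTMF frequencies and pick the strongest row/column pair.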

HONORS & ACTIVITIES

  • Member of Institute of Electrical and Electronics Engineers (IEEE).
  • Member of winning team for Soccer tournament organized by Indian Association.
  • Member of winning team for Volleyball tournament organized by Indian Association.

B.E. (Electronics and Communication Engineering)

Projects

  • Developed an intelligent building system based on the smart card concept using the 8051 microcontroller.
  • Developed a digital frequency meter.
