Sr. Data Modeler Resume
Quincy, MA
SUMMARY:
- 8 years of extensive experience in data modeling, data analysis, and data mapping for Online Transaction Processing (OLTP) and Data Warehousing (OLAP)/Business Intelligence (BI) applications.
- Worked as a Data Modeler in both forward engineering and reverse engineering using data modeling tools.
- Proficient in data mart design, creation of cubes, dimensional data modeling, identifying facts and dimensions, and Star Schema and Snowflake Schema design.
- Good knowledge of Data Vault Modeling.
- Proven knowledge of data cleansing, data transformation, source-to-target data mapping specifications, data warehouse loads, determining hierarchies, and building logic to handle Slowly Changing Dimensions.
- Experience with Big Data technologies such as Hadoop, BigQuery, Hive, HBase, Pig, Cassandra, and MongoDB.
- Worked on data warehouse and ETL architecture on the Oracle Database platform as a Data Modeler, using Informatica PowerCenter and SSIS.
- Strong understanding of Teradata SQL Assistant, Teradata Administrator, and data load/export utilities such as BTEQ, FastLoad, MultiLoad, and FastExport.
- Experienced in working with BI reporting tools such as MicroStrategy, Business Objects, and SSRS, producing tables, reports, graphs, and listings, and performing complex data manipulations using various procedures.
- Experienced in Data Extraction/Transformation/Loading (ETL), data conversion, and data migration using SQL Server Integration Services (SSIS), PL/SQL scripts, and Informatica against heterogeneous data sources.
- Experienced in dealing with data sources ranging from flat files, Excel, Oracle, Sybase, SQL Server, Teradata, and DB2 to multidimensional databases.
- Created DDL scripts to implement data model changes; published data models to Acrobat PDF files; created reports in HTML and RTF formats depending on the requirement; published data models to the model mart; created naming-convention files; and coordinated with DBAs to apply the data model changes.
- Extensive experience in writing system specifications, translating user requirements into technical specifications, and creating, maintaining, and modifying database design documents with detailed descriptions of logical entities and physical tables.
- Experienced in developing database design documents including conceptual, logical, and physical data models. General understanding of database management systems and data warehouses, including their functional and technical architecture and the design of data flow diagrams.
- Expertise in normalization techniques for data consistency and flexible database design. Experience in writing data quality rules to ensure the quality, consistency, and accuracy of data.
- Good knowledge of ETL processes such as source extraction, mapping, transformation, and staging, and created various ETL documents such as ETL mapping documents, data mapping documents, and ETL test scripts.
- Good knowledge of designing Entity Relationship and Data Flow Diagrams for business processes. Strong understanding of data modeling and dimensional modeling concepts such as Slowly Changing Dimensions.
- Good experience with data warehouse ETL testing using SSIS packages as well as BI, and worked with ad hoc reporting tools such as Tableau and Cognos.
- Proficient in data warehousing techniques for data cleansing, Slowly Changing Dimensions, surrogate key assignment, and change data capture (a minimal sketch follows this summary).
- Proficient in implementing and providing business intelligence on data warehouse and data mart solutions.
- Excellent at identifying performance bottlenecks and tuning Informatica loads for better performance and efficiency.
- Expertise in implementing complex business rules by creating robust mappings and partitioning.
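The Slowly Changing Dimension handling, surrogate key assignment, and change data capture mentioned above can be illustrated with a minimal sketch. It is not taken from any specific project here: the table, columns, and data are hypothetical, and sqlite3 (Python standard library) stands in for the actual warehouse platform.

```python
# Minimal SCD Type 2 sketch; table and column names are hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE dim_customer (
        customer_sk  INTEGER PRIMARY KEY AUTOINCREMENT,  -- surrogate key
        customer_id  TEXT,                               -- natural key
        city         TEXT,
        effective_dt TEXT,
        end_dt       TEXT,
        is_current   INTEGER
    )""")

def apply_scd2(customer_id: str, city: str, load_dt: str) -> None:
    """Close the current row if the tracked attribute changed, then
    insert a new current row with a fresh surrogate key."""
    row = conn.execute(
        "SELECT customer_sk, city FROM dim_customer "
        "WHERE customer_id = ? AND is_current = 1", (customer_id,)).fetchone()
    if row and row[1] == city:
        return  # no change detected (a trivial form of change data capture)
    if row:
        conn.execute(
            "UPDATE dim_customer SET end_dt = ?, is_current = 0 "
            "WHERE customer_sk = ?", (load_dt, row[0]))
    conn.execute(
        "INSERT INTO dim_customer (customer_id, city, effective_dt, end_dt, is_current) "
        "VALUES (?, ?, ?, '9999-12-31', 1)", (customer_id, city, load_dt))

apply_scd2("C001", "Quincy", "2015-01-01")
apply_scd2("C001", "Boston", "2016-06-15")   # triggers a Type 2 change
for r in conn.execute("SELECT * FROM dim_customer"):
    print(r)
```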
PROFESSIONAL EXPERIENCE:
Confidential, Quincy, MA
Sr. Data Modeler
Responsibilities:
- Involved in the analysis of a variety of source system data, coordination with subject matter experts, development of standardized business names and definitions, construction of a non-relational data model, publication of a data dictionary, review of the model and dictionary with subject matter experts, and generation of data definition language (DDL).
- Attended daily scrum with the entire team and the BI reporting teams; provided data provisioning updates and raised outstanding issues that affected the team. Worked with Webi and JSON; improved efficiency via links to SharePoint.
- Implemented end-to-end systems for data analytics and data automation, and integrated them with custom visualization tools using R, Mahout, Hadoop, and MongoDB.
- Designed and developed Informatica mappings to implement business rules for loading data. Extensively worked with Informatica Lookup, Stored Procedure, and Update transformations to implement complex rules and business logic.
- Tested the ETL process both before and after the data validation process. Tested the messages published by the ETL tool and the data loaded into various databases.
- Interacted with the end-user community to understand business requirements, analyzed them, and designed the specification document. Led multiple project teams through all phases of the SDLC using technologies including SQL and data warehousing.
- Created ETL jobs to load financial JSON data and server data into MongoDB, and moved data from MongoDB into the data warehouse.
- Extensively used the Informatica Debugger to diagnose problems in mappings; also involved in troubleshooting existing ETL bugs.
- Worked with and extracted data from various database sources such as Oracle, SQL Server, DB2, and Teradata.
- Developed sample data files from JSON models to validate entity schemas before integration (see the validation sketch after this list).
- Exposure to implementing Extraction mapping in Informatica (Mapping Designer).
- Created data flow diagrams from source to staging to fact/dimension (warehouse) tables and created source-to-target (S2T) mapping documents with the required business rules. Created a dimensional model for the reporting system by identifying the required grain, facts, and dimensions.
- Validated and updated the appropriate LDMs (logical data models) to process mappings, screen designs, use cases, the business object model, and the system object model as they evolved and changed.
- Worked on importing and cleansing high-volume data from various sources such as Teradata, Oracle, flat files, and SQL Server; worked on multiple database platforms including SQL Server, Oracle, and Teradata.
- Wrote features to filter raw JSON data from BigQuery, AWS SQS, and a publishing API; the data was transformed and uploaded to AWS DynamoDB with Python to keep the streaming data pipeline safe (a filter-and-load sketch follows this list).
- Responsible for defining the key identifiers for each mapping/interface. Created technical specification documents based on the functional design document for the ETL coding to build the data mart.
- Worked with the data architects to create draft models containing the properties necessary to meet the model requirements, and translated them into a set of JSON entity schemas used in the API development process.
- Used Informatica Designer, Workflow Manager and Repository Manager to create source and target definition, design mappings, create repositories and establish users, groups and their privileges.
- Performed data mapping between source systems and target systems, performed logical data modeling, created class diagrams and ER diagrams, and used SQL queries to filter data.
- Translated business requirements; worked with the Business Analyst and DBA on requirements gathering, business analysis, testing, and project coordination. Employed process and data analysis to model a Customer Information Business System.
- Developed data mapping documents for integration into a central model and depicting data flow across systems.
- Worked in the capacity of ETL Developer (Oracle Data Integrator (ODI)/PL/SQL) to migrate data from different sources into the target Oracle data warehouse.
- Architected and converted a SQL Server database into MongoDB.
- Wrote an application in C# to bulk insert data from SQL Server into MongoDB (sketched in Python after this list).
- Involved in designing Informatica mappings by translating business requirements, and extensively worked with Informatica Lookup, Update, and Router transformations to implement complex rules and business logic.
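A minimal sketch of the schema-validation step above, assuming the third-party `jsonschema` package (`pip install jsonschema`); the entity schema and fields shown are illustrative, not the project's actual schemas.

```python
# Validate a sample data file against a hypothetical JSON entity schema.
import json
from jsonschema import validate, ValidationError

entity_schema = {
    "type": "object",
    "properties": {
        "account_id": {"type": "string"},
        "balance":    {"type": "number"},
        "opened_on":  {"type": "string", "format": "date"},
    },
    "required": ["account_id", "balance"],
}

sample = json.loads('{"account_id": "A-100", "balance": 2500.75}')
try:
    validate(instance=sample, schema=entity_schema)
    print("sample conforms to the entity schema")
except ValidationError as err:
    print("schema violation:", err.message)
```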
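The filter-and-load step described above might look roughly like the following, using `boto3` against SQS and DynamoDB. The queue URL, table name, and record fields are assumptions, not details of the actual pipeline.

```python
# Read raw JSON messages from SQS, keep valid records, write to DynamoDB.
import json
from decimal import Decimal

import boto3

QUEUE_URL = "https://sqs.us-east-1.amazonaws.com/123456789012/raw-events"  # hypothetical
sqs = boto3.client("sqs")
table = boto3.resource("dynamodb").Table("streamed_events")  # hypothetical

def filter_record(raw: str):
    """Drop malformed JSON and records missing the required keys."""
    try:
        rec = json.loads(raw)
    except json.JSONDecodeError:
        return None
    if "event_id" not in rec or "amount" not in rec:
        return None
    rec["amount"] = Decimal(str(rec["amount"]))  # DynamoDB rejects floats
    return rec

resp = sqs.receive_message(QueueUrl=QUEUE_URL, MaxNumberOfMessages=10)
for msg in resp.get("Messages", []):
    record = filter_record(msg["Body"])
    if record is not None:
        table.put_item(Item=record)
    # remove the message whether or not it passed the filter
    sqs.delete_message(QueueUrl=QUEUE_URL, ReceiptHandle=msg["ReceiptHandle"])
```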
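The bulk-insert application above was written in C#; the same idea is sketched here in Python for illustration, assuming the third-party `pyodbc` and `pymongo` packages. Connection strings, table, and collection names are placeholders.

```python
# Copy rows from SQL Server into MongoDB in batches.
import pyodbc
from pymongo import MongoClient

sql_conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};SERVER=.;DATABASE=Sales;Trusted_Connection=yes;")
mongo = MongoClient("mongodb://localhost:27017")
target = mongo["sales"]["customers"]

cursor = sql_conn.cursor()
cursor.execute("SELECT customer_id, name, city FROM dbo.Customers")
cols = [c[0] for c in cursor.description]  # column names from the result set

BATCH = 1000
while True:
    rows = cursor.fetchmany(BATCH)
    if not rows:
        break
    # each relational row becomes one document; insert_many is the bulk path
    target.insert_many([dict(zip(cols, row)) for row in rows])
```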
Confidential, Charlotte, NC
Data Modeler
Responsibilities:
- Worked extensively on various projects implementing database changes in the respective data models, reverse engineering database structures to enhance the models, and comparing models to incorporate changes.
- Wrote database interface specifications and documented them in the Data Manager data dictionary.
- Extensively used normalization techniques to design logical/physical data models and relational database designs.
- Hands-on data modeling expertise for relational databases, covering both logical and physical data models, including detailed knowledge of normalization, forward-engineering, and reverse-engineering techniques.
- Used the RegEx and JSON serializers/deserializers (SerDes) packaged with Hive to parse the contents of streamed log data (a parsing sketch follows this list).
- Extensively involved in various data modeling tasks including forward engineering, reverse engineering, complete compare, creating DDL scripts, creating subject areas, maintaining the data dictionary, publishing models in PDF and HTML formats, and generating various data modeling reports.
- Worked with data governance, data quality, and data lineage teams and data architects to design various models and processes.
- Designed ETL processes using the Informatica tool to load data from file systems and Oracle into the target database, working through SQL Developer.
- Extracted data from Oracle as JSON files with PL/SQL code.
- Imported the JSON files into MongoDB collections using the mongoimport utility (see the import sketch after this list).
- Involved in the extraction, transformation, and loading jobs of the data warehouse using Informatica.
- Developed mappings and sessions using Informatica PowerCenter for data loading.
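Hive's SerDes do the parsing inside Hive itself; as a plain illustration of the same idea, the sketch below uses a Python regular expression to split streamed log lines into columns and serialize them as JSON. The log format is an assumption.

```python
# Parse raw log lines into named columns with a RegEx, then emit JSON.
import json
import re

LOG_PATTERN = re.compile(r"^(?P<ts>\S+ \S+) (?P<level>[A-Z]+) (?P<message>.*)$")

def parse_line(line: str):
    m = LOG_PATTERN.match(line)
    return m.groupdict() if m else None  # skip lines that don't match

for raw in ["2016-03-01 10:15:02 INFO job started",
            "2016-03-01 10:15:09 ERROR lookup failed"]:
    parsed = parse_line(raw)
    if parsed:
        print(json.dumps(parsed))
```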
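A sketch of the import step above: assuming the PL/SQL extract has already written `customers.json` as newline-delimited JSON, Python drives the mongoimport utility via subprocess. The database, collection, and file names are placeholders.

```python
# Drive mongoimport from Python to load the PL/SQL-generated JSON file.
import subprocess

subprocess.run(
    [
        "mongoimport",
        "--db", "staging",            # target MongoDB database
        "--collection", "customers",  # target collection
        "--file", "customers.json",   # newline-delimited JSON from PL/SQL
    ],
    check=True,  # raise if mongoimport exits non-zero
)
```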
Confidential, Dallas, TX
Sr. Data Analyst
Responsibilities:
- Analyzed the business requirements and designed conceptual and logical data models using Informatica, and generated database schemas and DDL (Data Definition Language) using forward and reverse engineering.
- Experience in Informatica Data Quality (IDQ)/Informatica Developer for cleansing and formatting customer master data.
- Delivered numerous POCs spanning network analysis, D3.js, various graph databases, and MongoDB.
- Identified and designed business entities, attributes, and the relationships between the entities to develop a conceptual model and a logical model, then translated them into a physical model.
- Implemented normalization and denormalization techniques to build tables, indexes, and views, and implemented and maintained stored procedures as per requirements.
- Performed data analysis and data profiling using complex SQL queries on various source systems including Oracle and Teradata (a profiling sketch follows this list).
- Designed and Developed Oracle PL/SQL and Shell Scripts, Data Import/Export, Data Conversions and Data Cleansing.
- Used ad hoc queries for querying and analyzing the data, and participated in data profiling, data analysis, data validation, and data mining.
- Involved in the extraction, transformation, and loading jobs of the data warehouse using Informatica PowerCenter.
- Extensively used SQL*Loader to load data from legacy systems into Oracle databases using control files, and used the Oracle external tables feature to read data from flat files into Oracle staging tables (a loader sketch follows this list).
- Developed complex ETL mappings for Stage, Dimensions, Facts and Data marts load.
- Prepared ETL technical mapping documents along with test cases for each mapping, supporting future development and maintaining the SDLC and migration process. Used Talend for extraction and reporting purposes.
- Actively took part in data profiling, data cleansing, data migration, and data mapping, and helped ETL developers compare data with original source documents and validate data accuracy.
- Worked with Tableau for data analysis, digging into source-system data and diving deep into the data for predictive findings and various analyses using dashboards and visualizations.
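A minimal sketch of the kind of per-column profiling queries mentioned above. The actual profiling ran against Oracle and Teradata; sqlite3 (Python standard library) stands in here so the example runs as-is, and the table is illustrative.

```python
# Per-column profiling: row count, nulls, distinct values, min/max.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customer (customer_id TEXT, city TEXT, balance REAL);
    INSERT INTO customer VALUES ('C1', 'Dallas', 120.0),
                                ('C2', NULL,     55.5),
                                ('C1', 'Plano',  NULL);
""")

for col in ("customer_id", "city", "balance"):
    row = conn.execute(f"""
        SELECT COUNT(*),
               SUM(CASE WHEN {col} IS NULL THEN 1 ELSE 0 END),
               COUNT(DISTINCT {col}),
               MIN({col}), MAX({col})
        FROM customer""").fetchone()
    print(col, dict(zip(("rows", "nulls", "distinct", "min", "max"), row)))
```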
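A sketch of driving SQL*Loader from Python: write a small control file, then invoke sqlldr. The table, data file, and credentials are placeholders; the real loads used project-specific control files.

```python
# Generate a SQL*Loader control file and run sqlldr against it.
import subprocess
from pathlib import Path

control = """\
LOAD DATA
INFILE 'customers.dat'
APPEND INTO TABLE stg_customers
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
(customer_id, name, city)
"""
Path("load_customers.ctl").write_text(control)

subprocess.run(
    ["sqlldr", "userid=scott/tiger@orcl",      # hypothetical credentials
     "control=load_customers.ctl", "log=load_customers.log"],
    check=True,
)
```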
Confidential, Pittsburgh, PA
Data Analyst
Responsibilities:
- Involved in data mapping specifications to create and execute detailed system test plans. The data mapping specifies what data will be extracted from an internal data warehouse, transformed, and sent to an external entity.
- Worked with partially adjudicated insurance flat files, internal records, third-party data sources, JSON, XML, and more.
- Tested Complex ETL Mappings and Sessions based on business user requirements and business rules to load data from source flat files and RDBMS tables to target tables.
- Wrote and executed SQL queries to verify that data had been moved from the transactional system to the DSS, data warehouse, and data mart reporting systems in accordance with requirements (a reconciliation sketch follows this list).
- Troubleshot test scripts, SQL queries, ETL jobs, and data warehouse/data mart/data store models.
- Responsible for various data mapping activities from source systems to Teradata.
- Created the test environment for the staging area and loaded it with data from multiple sources.
- Responsible for analyzing various heterogeneous data sources such as flat files, ASCII data, EBCDIC data, and relational data (Oracle, DB2 UDB, MS SQL Server).
- Tested the ETL process both before and after the data validation process; tested the messages published by the ETL tool and the data loaded into various databases.
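A minimal sketch of the source-to-target reconciliation queries mentioned above: compare row counts and column sums between the transactional source and the warehouse target. sqlite3 (Python standard library) stands in for the actual databases, and the tables are illustrative.

```python
# Reconcile a source table against its warehouse counterpart.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE src_orders (order_id INTEGER, amount REAL);
    CREATE TABLE dw_orders  (order_id INTEGER, amount REAL);
    INSERT INTO src_orders VALUES (1, 10.0), (2, 20.5);
    INSERT INTO dw_orders  VALUES (1, 10.0), (2, 20.5);
""")

checks = {
    "row_count":  "SELECT (SELECT COUNT(*) FROM src_orders) - (SELECT COUNT(*) FROM dw_orders)",
    "amount_sum": "SELECT (SELECT SUM(amount) FROM src_orders) - (SELECT SUM(amount) FROM dw_orders)",
}
for name, sql in checks.items():
    diff = conn.execute(sql).fetchone()[0]
    status = "OK" if diff == 0 else f"MISMATCH ({diff})"
    print(f"{name}: {status}")
```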