Data Modeler/Analyst Resume
SUMMARY
- Data Modeler with 7+ years of experience in data analysis and modeling, with an excellent understanding of data warehouse and data mart design.
- Extensive ETL tool experience with IBM DataStage 7.5/8.1/8.5/8.7/9.1/11.3, including the DataStage client tools DataStage Designer, DataStage Director, and DataStage Administrator.
- Experienced in scheduling sequence, parallel, and server jobs using DataStage Director, UNIX scripts, and scheduling tools.
- Extensive experience with Informatica Cloud services and with the creation and maintenance of database objects such as tables, views, materialized views, indexes, constraints, primary keys, sequences, synonyms, and database links.
- Used Informatica IDQ 8.6.1 to complete initial data profiling and to match and remove duplicate data.
- Created ETL test data for all ETL mapping rules to test the functionality of the Informatica mappings.
- Experience in converting SSIS packages and Hadoop HiveQL to Informatica.
- Good experience in DataStage administration and IBM Information Server (IS).
- Experience in data cleansing: matching user-entered data against database data, removing duplicates, and extracting relationships from source systems using QualityStage.
- Experience in data enrichment and re-engineering using QualityStage and DataStage.
- Expert in database design, data modeling, development, implementation, ETL, and reporting in SQL Server 2005/2008/2008 R2/2012, with expertise in data normalization.
- Conversant with all phases of the Software Development Life Cycle (SDLC), especially Agile and Scrum, including business process analysis, requirements gathering and analysis, detailed design, development, testing, and post-implementation support.
- Strong knowledge of data warehouse concepts and technologies such as ETL processes, dimensional modeling, star and snowflake schemas, reporting tools, and surrogate key generation.
- Highly proficient in creating database objects such as tables, indexes, views, and user-defined functions (see the DDL sketch after this list).
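To make the star schema and surrogate-key work above concrete, here is a minimal T-SQL sketch. The sales subject area and every table and column name are hypothetical illustrations, not taken from any actual engagement.

```sql
-- Minimal star-schema sketch; dim_customer and fact_sales are illustrative names.
CREATE TABLE dim_customer (
    customer_sk   INT IDENTITY(1,1) PRIMARY KEY, -- surrogate key
    customer_id   VARCHAR(20) NOT NULL,          -- natural (business) key
    customer_name VARCHAR(100),
    effective_dt  DATE,
    expiry_dt     DATE,
    is_current    BIT NOT NULL DEFAULT 1         -- current-version flag for SCD handling
);

CREATE TABLE fact_sales (
    sales_id    INT NOT NULL PRIMARY KEY,
    customer_sk INT NOT NULL REFERENCES dim_customer (customer_sk),
    sales_amt   DECIMAL(12,2),
    qty_sold    INT
);

-- Index the dimension foreign key that typical star-join queries filter on.
CREATE INDEX ix_fact_sales_customer ON fact_sales (customer_sk);
```

The IDENTITY column stands in for the surrogate-key generation mentioned above; in a DataStage or Informatica job the key is often generated inside the job itself instead.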
PROFESSIONAL EXPERIENCE
Confidential
Data Modeler/Analyst
Responsibilities:
- Performed system study and requirements analysis; prepared data flow diagrams, entity-relationship diagrams, data diagrams, and table structures.
- Strong data modeling experience with ER diagrams; dimensional data modeling; conceptual, logical, and physical modeling using Third Normal Form (3NF); and star schema and snowflake modeling using tools such as Erwin, ER/Studio, and SAP PowerDesigner.
- Extensively worked on ETL mappings and on the analysis and documentation of OLAP report requirements; solid understanding of OLAP concepts and challenges, especially with large data sets.
- Understanding of OLAP processing for changing and maintaining the warehouse, optimizing dimensions and hierarchies, and adding aggregations to the cube.
- Involved in analyzing, designing, building, and testing OLAP cubes with SSAS 2008, and in adding calculations using MDX.
- Extensively used the Informatica client tools Source Analyzer, Warehouse Designer, and Mapping Designer.
- Developed Informatica mappings for data extraction and loading; worked with Expression, Lookup, Filter, Sequence Generator, and Aggregator transformations to load data from source to target.
- Conceptualized and developed initial and incremental data loads in Informatica using the Update Strategy transformation (see the delta-load sketch after this list).
- Designed the ETL process using Informatica and implemented data movement, error capturing and reporting, initial and delta loads, and a change data capture (CDC) methodology.
- Created DDL scripts for implementing Data Modeling changes.
- Created Erwin Crystal Reports in HTML and RTF formats as required, published the data model to the model mart, created naming-convention files, and coordinated with DBAs to apply data model changes.
- Extensively used Erwin for reverse engineering, forward engineering, subject areas, domains, naming standards documents, etc.
- Experienced in working with the Hortonworks distribution of Hadoop: HDFS, MapReduce, Hive, Sqoop, Flume, Pig, HBase, Kafka, and MongoDB.
- Extracted bank data from the source using a Hadoop connector and loaded it into the HDFS data lake.
- Created data mapping documents providing the transformation rules for loading data from source to target systems and serving as input to the ETL jobs.
- Good at data warehouse techniques: dimensional data modeling, star schemas, and snowflake schemas.
- Designed and developed star and snowflake schemas in Erwin, creating fact and dimension tables for the data warehouse and data marts.
- Provided guidance to the ETL team in translating the data mapping document into a high-level design document and during the creation of ETL jobs.
- Designed and developed ETL processes using DataStage to load data from Teradata and flat files into the staging database and from staging into the target data warehouse database.
- Used DataStage stages such as Hashed File and Sequential File.
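As a rough illustration of the initial/delta load and change data capture pattern referenced above, the following T-SQL uses a high-water-mark control table. The names etl_control, stg_sales, and fact_sales are assumptions continuing the hypothetical sketch from the summary.

```sql
-- Minimal delta-load sketch; all object names are illustrative placeholders.
DECLARE @last_load DATETIME;

SELECT @last_load = last_load_ts
FROM   etl_control
WHERE  job_name = 'load_fact_sales';

-- Pick up only rows changed since the previous run (the delta).
MERGE fact_sales AS tgt
USING (
    SELECT sales_id, customer_sk, sales_amt
    FROM   stg_sales
    WHERE  updated_ts > @last_load
) AS src
ON tgt.sales_id = src.sales_id
WHEN MATCHED THEN
    UPDATE SET tgt.sales_amt = src.sales_amt
WHEN NOT MATCHED THEN
    INSERT (sales_id, customer_sk, sales_amt)
    VALUES (src.sales_id, src.customer_sk, src.sales_amt);

-- Record the high-water mark for the next incremental run.
UPDATE etl_control
SET    last_load_ts = GETDATE()
WHERE  job_name = 'load_fact_sales';
```

In Informatica the same split between inserts and updates would typically be driven by a Lookup plus an Update Strategy transformation rather than a single MERGE.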
Confidential
Data Modeler/Analyst
Responsibilities:
- Interacted with business users and analysts to gather and document technical and business metadata.
- Created various standard, reusable DataStage jobs using active and passive stages such as Sort, Lookup, Filter, Join, Transformer, Aggregator, Change Capture, Sequential File, and Data Set.
- Worked as a data modeler/analyst generating data models with Erwin and developing the relational database system.
- Involved in extracting data from OLTP to OLAP systems using SSIS.
- Worked on OLAP data warehouse modeling, design, and implementation.
- Created the conceptual model for the data warehouse using the Erwin data modeling tool.
- Expertise in building Enterprise Data Warehouses (EDW), Operational Data Stores (ODS), data marts, and Decision Support Systems (DSS) using multidimensional and dimensional modeling (star and snowflake schema) concepts.
- Expertise in OLAP/OLTP system study, analysis, and E-R modeling; developed database schemas such as star and snowflake schemas with conformed dimensions and slowly changing dimensions, used in relational, dimensional, and multidimensional modeling.
- Conducted introductory and hands-on sessions on Hadoop HDFS architecture, Hive, Talend, and Pig for other teams.
- Worked with Informatica data integration tools such as Repository Manager, Designer, Workflow Manager, and Workflow Monitor; scheduled workflows using Workflow Manager.
- Worked on developing Informatica Mappings, Mapplets, Sessions, Workflows and Worklets for data loads.
- Extensive experience developing ETL processes for data extraction, transformation, and loading using Informatica PowerCenter.
- Delivered the final source-to-target mapping and insert scripts to the Hadoop developers.
- Designed and developed a customizable data management system using Hadoop to interface with the current RBAC system.
- Implemented different schema techniques such as snowflake and star schemas.
- Performed reverse engineering of the current application using Erwin and developed logical and physical data models for central model consolidation.
- Created physical and logical ER diagrams using Erwin and mapped the data into database objects.
- Designed and created the logical data model for the data mart.
- Good working knowledge of metadata management: consolidating metadata from disparate tools and sources, including the data warehouse, ETL, relational databases, and third-party metadata, into a single repository to provide information on data usage and end-to-end change impact analysis.
- Developed and maintained a data dictionary to create metadata reports for technical and business purposes.
- Mapped metadata as data moved from the operational environment to the data warehouse environment.
- Performed transformations using SSIS tasks such as Conditional Split and Derived Column, and performed data scrubbing, including data validation checks during staging, before loading the data into the data warehouse (see the validation sketch after this list).
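The staging-level validation in the last bullet can be sketched in plain SQL as an analogy to the SSIS Conditional Split logic. The tables stg_customer and stg_customer_reject, their columns, and the specific checks are all assumptions for illustration.

```sql
-- Route rows that fail basic scrubbing checks to a reject table before the
-- warehouse load; table names, column names, and rules are illustrative.
INSERT INTO stg_customer_reject (customer_id, reject_reason)
SELECT customer_id,
       CASE
           WHEN customer_name IS NULL     THEN 'Missing customer name'
           WHEN email NOT LIKE '%_@_%._%' THEN 'Malformed email address'
       END AS reject_reason
FROM   stg_customer
WHERE  customer_name IS NULL
   OR  email NOT LIKE '%_@_%._%';
```

Routing rejects to a side table, rather than silently dropping them, mirrors how a Conditional Split sends failing rows down a separate output for later review.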
Confidential
Data Modeler/Analyst
Responsibilities:
- Involved as an ETL developer during the analysis, planning, design, development, and implementation stages of multiple projects, such as UCUES (UC Undergraduate Experience Survey) and UC Path, using IBM WebSphere DataStage.
- Wrote multiple stored procedures with transaction processing for different modules.
- Used ERwin to transform data requirements into data models.
- Good work experience with Informatica data integration tools such as Repository Manager, Designer, Workflow Manager, and Workflow Monitor; scheduled workflows using Workflow Manager.
- Worked on developing Informatica Mappings, Mapplets, Sessions, Workflows and Worklets for data loads.
- Extensive experience developing ETL processes for data extraction, transformation, and loading using Informatica PowerCenter.
- Extensively used Informatica PowerCenter 9.6.1 for ETL (extraction, transformation, and loading) of data from relational tables.
- Walked through the Informatica and Teradata code to identify protected-information references in columns such as SSN, Medicaid number, last name, and first name.
- Involved in upgrading Informatica PowerCenter from 9.5.1 to 9.6.1.
- Worked with dimensional data warehouses in star and snowflake schemas; created slowly changing dimension (SCD) Type 1/2/3 mappings following the Ralph Kimball methodology (see the Type 2 sketch after this list).
- Extensive experience in database activities such as data modeling, database design and development, coding, implementation, maintenance, and performance monitoring and tuning using tools such as the Index Tuning Wizard, SQL Profiler, and replication.
- Mapped metadata as data moved from the operational environment to the data warehouse environment.
- Expertise in SQL Server Analysis Services (SSAS) for delivering online analytical processing (OLAP) and data mining functionality for business intelligence applications.
- Experience in data transformation, data mapping from source to target database schemas, data cleansing procedures.
- Collaborated on the source-to-target data mapping document and on the data quality assessments for the source data.
- Created SSIS packages to pull weekly page-view dumps from Staples databases into our databases.
- Developed SSIS packages for both initial data loads and incremental/delta loads.
- Created log tables to capture SSIS events and used them to debug packages.
- Created and scheduled an SSIS package to run our stored procedures, create the output report tables, and regenerate reports automatically on a weekly or monthly basis using SSRS 2005.
- Developed and supported the extraction, transformation, and load (ETL) process for a data warehouse from various data sources using DataStage Designer.
- Created reference tables and mapped page IDs to product-line categories and information sources.
- Tuned the parallel jobs using appropriate partitioning techniques and worked closely with the DBA to create the proper indexes to handle the load.
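A minimal T-SQL sketch of the SCD Type 2 pattern mentioned above, reusing the hypothetical dim_customer and stg_customer names from the earlier sketches. An actual Informatica or DataStage job would implement the same two steps with Lookup plus Update Strategy or Change Capture stages; everything here is an illustrative assumption.

```sql
-- Step 1: expire the current dimension row when a tracked attribute changed.
UPDATE d
SET    d.expiry_dt  = CAST(GETDATE() AS DATE),
       d.is_current = 0
FROM   dim_customer AS d
JOIN   stg_customer AS s
  ON   s.customer_id = d.customer_id
WHERE  d.is_current = 1
  AND  s.customer_name <> d.customer_name;   -- tracked attribute differs

-- Step 2: insert a fresh current row for any business key without one
-- (covers both the customers expired above and brand-new customers).
INSERT INTO dim_customer (customer_id, customer_name, effective_dt, expiry_dt, is_current)
SELECT s.customer_id, s.customer_name, CAST(GETDATE() AS DATE), '9999-12-31', 1
FROM   stg_customer AS s
WHERE  NOT EXISTS (SELECT 1
                   FROM   dim_customer AS d
                   WHERE  d.customer_id = s.customer_id
                     AND  d.is_current  = 1);
```

Type 1 would simply overwrite the attribute in place, and Type 3 would keep the prior value in an extra column; Type 2, as here, preserves full history by versioning rows.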
TECHNICAL SKILLS
- Data Analysis
- Data Cleansing
- Data Integration
- Data Integrity
- Data Management
- Data Mapping
- Data Marts
- Data Mining
- Data Modeling
- Data Profiling
- Data Transformation
- Data Warehouses
- Database Design
- Database Modeling
- DataStage
- DB2
- DDL
- ETL
- Flume
- Hadoop (HDFS, MapReduce, Hive, Sqoop)
- Microsoft Access
- MS SQL Server 2005/2008/2012
- OLTP
- Oracle 10g
- PL/SQL
- Relational Databases
- Replication
- SQL
- Stored Procedures