Sr. Data Modeler Resume
Chicago, IL
OBJECTIVE
- Around seven years of professional experience in data modeling, design, development, and implementation of OLTP systems and Data Warehouses/Data Marts. Seeking an opportunity to apply this technical expertise toward making a positive impact on the growth of your organization while continuing to grow in the field of Data Modeling.
SUMMARY
- Highly motivated Data Analyst with 7 years of experience in the information technology industry, with a solid understanding of data modeling and evaluating data sources and a strong understanding of Data Warehouse/Data Mart design, ETL, BI, OLAP, and client/server applications in domains such as Healthcare, Mortgage-Backed Securities, Derivatives, Equities, Corporate Bonds, and Credit.
- Extensive knowledge of dimensional data modeling, data mining, data flow analysis, Star and Snowflake schemas, process mapping using top-down and bottom-up approaches, and Fact and Dimension tables.
- Expert in modeling tools such as Erwin, PowerDesigner, and ER/Studio. Worked with and extracted data from various sources including Oracle, MS SQL Server, DB2, and flat files using SQL and PL/SQL.
- Strong knowledge of all phases of the Software Development Life Cycle (SDLC) including requirements gathering, analysis, design, implementation, testing and deployment, as well as software engineering methodologies such as Rational Unified Process (RUP) and Agile.
- Experience in business process analysis/modeling, business requirements gathering, and developing high-level designs for business applications.
- Extensive experience in creating Business Requirement Documents (BRD), Use Cases, Business Systems Designs (BSD), Application Information Document (AID), Project Plans, Requirements Traceability Matrix (RTM) throughout the Life Cycle of a project.
- Experience in Performance Management to support strategic planning, goal setting, strengthen accountability, enhance decision making and to improve client satisfaction.
- Ability to interact with individuals at all levels with exceptional skills in forming and facilitating Joint Application Development (JAD) and Rapid Application Development (RAD) sessions.
- Expertise in requirement analysis, gap analysis, impact analysis, cost/benefit analysis, risk analysis, testing, development of functional specifications and project planning.
- Experience in test data management using IBM Optim for performance management.
- Experience in SOA analysis and design, including data quality analysis to inform service implementation.
- Experience as BI Data Analyst, analyzing facts and dimensions as per client’s requirements.
- Experience working with Oracle Database 11g, Teradata, MS SQL Server, and MS Access.
- Exceptional Documentation Skills for writing Use Cases and Functional Requirement Documents as well as for creating Use Case diagrams, Activity diagrams, and Data flow diagrams based on UML Methodology and business process flow diagrams using MS Visio.
- Experience writing analytical SQL queries to perform data analysis, data validation, and quality checks.
- Proven track record in troubleshooting on Technical areas of Database Architecture, Design, and Optimizing ETL jobs and addressing production issues with respect to performance tuning and enhancement.
- Experienced in BI Reporting tools such as Business Objects, COGNOS and Oracle Discoverer.
- Experienced with querying databases using SQL Server 2005/2008 and Perl scripts.
- Experienced working with offshore vendors and establishing offshore teams and processes.
TECHNICAL SKILLS
Databases: Teradata V2R5/ V2R6/ V2R12, Oracle 9i/10g/11g, IBM DB2, MS SQL Server 2000, MS Access.
ETL Tools: Informatica, IBM Infosphere DataStage and QualityStage, SQL Server Integration Services (SSIS).
Data Modeling: Relational, Star-Schema and Snowflake modeling, Fact and Dimension tables, Erwin, ER/Studio.
Analytics & Reporting: Unica, Cognos, MicroStrategy, Business Objects XI R3, SSRS.
Database Tools/SQL Tools: TOAD, Squirrel, Oracle SQL Developer, PL/SQL Developer, SQL Explorer, Teradata SQL Assistant.
Methodologies: Star schema database design, structured systems analysis and design.
Repository Tools: Microsoft SharePoint, Rational Rose, Requisite Pro, C
Testing Tools: QTP, Load Runner.
Documentation: UML Diagrams, Data Mapping Document, Data dictionary, User specification document, Technical Specification Document, Process Documents.
Defect/Test Management Tool: Quality Center, Test Director.
Programming Languages: SQL, C, C++, MATLAB.
Operating Systems: UNIX, Mainframes, Microsoft Windows (98/2000/XP/NT/Vista/7/8).
Microsoft Tools: Microsoft Visio, Microsoft Project, Microsoft Office Suite (Word, Excel, Outlook, Publisher, PowerPoint), Desktop Deployment.
PROFESSIONAL EXPERIENCE
Confidential, Chicago, IL
Sr. Data Modeler
Responsibilities:
- Involved in Requirements gathering and gap analysis between current state and future state of the applications
- Acted as Liaison between managers, stakeholders and Architecture, Design and Development teams
- Assisted in developing logical and physical data models per business requirements using Erwin
- Converted the logical model into a physical model, a dimensional model with fact and dimension tables for the Operational Data Warehouse.
- Used Erwin Model Manager to manage the model repository and updated the Model Mart whenever changes were made to the existing model.
- Executed the generated scripts to create the physical model, with fact and dimension tables, in the Dev environment (see the illustrative DDL sketch after this list).
- Imported and exported data into HDFS and Hive using Sqoop.
- Involved in loading data from UNIX file system to HDFS.
- Developed Hive Queries for Analysis.
- Developed Pig Latin scripts to extract the data.
- Developed Pig Scripts for ETL Procedures.
- Created Impala tables over data loaded from the UNIX file system into HDFS.
- Analyzed the source data coming from Oracle, flat files, and MS Excel, and coordinated with the data warehouse team in developing the dimensional model.
- Implemented various mappings to transform data from source to target files using Source Analyzer, Warehouse Designer, and Mapping Designer in Informatica Designer.
- Created mappings using transformations such as Update Strategy, Lookup, Sequence Generator, Expression, Aggregator, XML, and Stored Procedure, and tested the mappings for data verification in the Designer.
- Designed dimensional data models by implementing Star and Snowflake schemas with fact and dimension tables to migrate data from various sources to the Enterprise Data Warehouse.
- Worked with EDW team and Data Architects to create reporting requirements for Operational Analytics.
- Worked with UNIX admins to open up firewalls and establish connectivity between application servers
- Performed feasibility, adaptability study and risk analysis to identify the business critical areas from the user perspective.
- Extensively interacted with ETL teams and developed data mapping documents to facilitate data loads from flat files
- Executed ETL jobs with Datastage Director to load data warehouse and data marts from disparate source systems.
- Interacted with DBA teams to run Teradata utilities such as TPump, MultiLoad, FastLoad, and FastExport
- Suggested and Implemented performance improvements by tweaking complex SQL queries
- Organized and coordinated code migration across different environments such as QA, Performance, and UAT
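Below is a minimal, illustrative DDL sketch of the kind of physical star-schema model forward-engineered from Erwin and executed in the Dev environment, as described above; the table and column names (DIM_CUSTOMER, DIM_DATE, FACT_SALES, etc.) are hypothetical placeholders rather than the actual warehouse objects:

    -- Illustrative star-schema DDL (hypothetical names), of the kind generated
    -- from the Erwin physical model and executed in the Dev environment
    CREATE TABLE DIM_CUSTOMER (
        CUSTOMER_KEY    INTEGER      NOT NULL PRIMARY KEY,   -- surrogate key
        CUSTOMER_ID     VARCHAR(20)  NOT NULL,               -- natural key from the source system
        CUSTOMER_NAME   VARCHAR(100),
        EFFECTIVE_DATE  DATE         NOT NULL                -- supports slowly changing dimensions
    );

    CREATE TABLE DIM_DATE (
        DATE_KEY        INTEGER      NOT NULL PRIMARY KEY,
        CALENDAR_DATE   DATE         NOT NULL,
        FISCAL_QUARTER  CHAR(2)
    );

    CREATE TABLE FACT_SALES (
        CUSTOMER_KEY    INTEGER      NOT NULL REFERENCES DIM_CUSTOMER (CUSTOMER_KEY),
        DATE_KEY        INTEGER      NOT NULL REFERENCES DIM_DATE (DATE_KEY),
        SALES_AMOUNT    DECIMAL(18,2),
        UNITS_SOLD      INTEGER,
        PRIMARY KEY (CUSTOMER_KEY, DATE_KEY)
    );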
Environment: Oracle 11g, Teradata, IBM Infosphere DataStage and QualityStage, Squirrel, Cognos 10.1, Autosys, UNIX, SQL Server 2008, ERwin, MS VISIO, Microsoft (Word, Access, Excel, Outlook).
Confidential, Chicago, IL
Sr. Data Modeler
Responsibilities:
- Discussed the tables to be migrated with the reporting team and report users, and used Erwin to design Logical/Physical Data Models, perform relational database design and forward/reverse engineering, and publish the data model to Acrobat files
- Analyzed the cubes and DTS/SSIS packages running on SQL Server, and designed and developed a Star Schema and created Fact and Dimension Tables for the Teradata warehouse using Erwin
- Experience with Independent verification and validation methodology (IV&V).
- Performed database administration tasks on the Teradata development database, such as creating tables, collecting statistics, creating indexes, granting access, verifying the distribution of data across all AMPs, and loading tables from text files using the import option in Teradata SQL Assistant.
- Designed and developed Operational Data Store, Slowly Changing Dimension, Staging areas etc.
- Designed Informatica architecture for development, integration, and system test environments
- Used the Control-Confidential job scheduler tool to schedule Informatica jobs and BOBJ reports in production; familiar with the Control-Confidential server concept (Control-Confidential is DELL's preferred job scheduler).
- Familiar with the BRIO reporting tool and BrioQuery, having analyzed reports running on BRIO 6.
- Informed users about the expected differences between Excel reports and BOBJ reports.
- Preparing data modeling requests to create table structures, change existing tables per requirements, and rename existing tables.
- Transferring data from prod to dev and QA using SFTP for unit testing and QA testing
- Writing SQL scripts for creating views, triggers, and sequences.
- Working with ETL developers and providing them data mapping specifications to develop ETL jobs to load the data into target databases.
- Preparing test cases for quality assurance, QA testing new tables, and documenting test results.
- Creating table views for BI reporting purposes (see the illustrative view sketch after this list).
- Maintained legacy Perl scripts while refactoring the code and migrating it to a new environment
- Developed programming standards and custom Perl modules to increase script resilience and maintainability.
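Below is a minimal sketch of the kind of table view created for BI reporting as mentioned above; the object names (V_MONTHLY_SALES, SALES_FACT, CUSTOMER_DIM, DATE_DIM) are hypothetical placeholders, not the actual reporting objects:

    -- Illustrative BI reporting view (hypothetical names) aggregating a fact
    -- table by month and customer segment for downstream BOBJ/Excel reports
    CREATE VIEW V_MONTHLY_SALES AS
    SELECT d.CALENDAR_MONTH,
           c.CUSTOMER_SEGMENT,
           SUM(f.SALES_AMOUNT) AS TOTAL_SALES,
           COUNT(*)            AS ORDER_COUNT
    FROM   SALES_FACT    f
    JOIN   CUSTOMER_DIM  c ON c.CUSTOMER_KEY = f.CUSTOMER_KEY
    JOIN   DATE_DIM      d ON d.DATE_KEY     = f.DATE_KEY
    GROUP BY d.CALENDAR_MONTH, c.CUSTOMER_SEGMENT;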
Environment: ERwin 7.3, Nexus for Teradata, T-SQL, TOAD for Oracle, SQL Server Management Studio, SQL Enterprise Manager, Informatica 8.6, Control-Confidential job scheduler, BRIO, Oracle 9i, SQL, PL/SQL, Windows 2000, Teradata.
Confidential, Pittsburgh, PA
Data Modeler
Responsibilities:
- Used Erwin to design Logical/Physical Data Models, perform relational database design and forward/reverse engineering, and publish the data model to Acrobat files; created Erwin reports in HTML or RTF format depending on the requirement, published the data model in the Model Mart, created naming convention files, and coordinated with the DBA to apply data model changes.
- Involved in requirement gathering, analysis of the requirements from the business owners and users.
- Interacted with users for verifying user requirements, managing change control process, updating existing documentation.
- Designed and developed the Star Schema and created Fact and Dimension Tables for the Warehouse using Erwin
- Involved in modifying Teradata stored procedures and performance tuning of various tables.
- Managing metadata by mapping dimensions and fact tables of various schemas/data models
- Generated reports like subject area reports, entity relationship diagrams, attribute reports, table reports, column reports, indexing reports, relationship reports using Erwin Tool.
- Actively involved in JAD sessions to identify what data was needed, how the data would be manipulated, what reports existed in the old warehouse, and what reports the new warehouse needed to generate.
- Analyzed business and end-user needs to ensure that project data requirements are identified and deliverables are met.
- Performed gap analysis between the AS-IS process and the TO-BE process, referring to the data dictionary document to conduct the analysis.
- Used the gap analysis to build the data mapping document, which is essential for determining the source tables and columns, the destination tables and columns, and the gaps, and for defining any business rule changes needed to close those gaps.
- Handed off the requirements to the ETL (extraction, transformation, & loading) developers, & worked with them on any questions they had.
- Performed data analysis, data validation, and data cleansing by writing complex SQL queries using TOAD against the Teradata database (see the validation query sketch after this list).
- Performed report validation writing SQL queries using TOAD on an Oracle database.
- Wrote complex SQL queries and PL/SQL procedures to extract data from various source tables of data warehouse.
- Performed performance tuning of SQL queries for a data warehouse consisting of many tables.
- Created PL/SQL packages and procedures, extensively using arrays, PL/SQL tables, cursors, user-defined object types, and exception handling.
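Below is a minimal sketch of the kind of data validation query described above; the staging and target table names (STG_CUSTOMER, DW_CUSTOMER) and the join key are hypothetical placeholders, not the actual warehouse objects:

    -- Illustrative validation query (hypothetical names): flag records present
    -- in the staging table but missing from the warehouse target
    SELECT s.CUSTOMER_ID
    FROM   STG_CUSTOMER s
    LEFT JOIN DW_CUSTOMER t
           ON t.CUSTOMER_ID = s.CUSTOMER_ID
    WHERE  t.CUSTOMER_ID IS NULL;

    -- Reconcile row counts between source and target
    -- (Teradata allows SELECT without FROM; Oracle would need FROM DUAL)
    SELECT (SELECT COUNT(*) FROM STG_CUSTOMER) AS SOURCE_ROWS,
           (SELECT COUNT(*) FROM DW_CUSTOMER)  AS TARGET_ROWS;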
Environment: Informatica 8.1, DataFlux, Oracle 9i, Quality Center 8.2, SQL, TOAD, PL/SQL, Flat Files, Teradata V2R6, ERwin 7.2, Microsoft Visio, Microsoft Project, Microsoft Office Suite, T-SQL, SQL Server 2005.
Confidential, Houston, TX
Data Modeler
Responsibilities:
- Created and reviewed requirement specifications and process flow diagrams, including context and data flow diagrams.
- Created the logical model based on the requirement documents, with database object naming conventions following industry best practices
- Defined the data sources, data loads and data transformation using ER Diagrams. Used MS-Visio for model diagrams, MS-Word for documentation and used Unified Modeling Language for visual modeling.
- Performed Data Modeling by creating Logical and Physical models of the database system up to 4th normal form in ER/Studio.
- Performed Forward Engineering to generate DDL scripts (see the illustrative sketch after this list) and Reverse Engineering of the Data Models.
- Worked on maintenance of the model using the Complete Compare and Update Model features.
- Developed a logical Integration model detailing the flow of information through the various components, including definition of external and internal message contents.
- Performed gap analysis from the beginning of the project to the end to ensure the project was heading in the right direction.
- Published data model supporting documents as HTML reports and uploaded data model diagrams and database scripts to SharePoint.
- Designed, debugged, and tested ETL jobs in the Data Integrator tool and monitored job status
- Designed message formats, queues, and system interfaces, and worked with the business users through interviews and JAD sessions.
- Facilitated the analysis and design phase, documented customizations, wrote test scripts for all processes and customizations, coordinated system testing and issue tracking, and coordinated the cutover process to Production.
- Analyzed, collected and prepared user requirements, definition, scope and expectations for deliverable plans. Delivered system documents, processes, diagrams, and test cases.
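Below is a minimal sketch of the kind of DDL script that forward engineering from the logical model would generate, as noted above; the PRODUCT/PRODUCT_CATEGORY tables and their constraints are hypothetical placeholders:

    -- Illustrative forward-engineered DDL (hypothetical names): a normalized
    -- parent/child pair with named primary and foreign key constraints
    CREATE TABLE PRODUCT_CATEGORY (
        CATEGORY_ID    INTEGER      NOT NULL,
        CATEGORY_NAME  VARCHAR(50)  NOT NULL,
        CONSTRAINT PK_PRODUCT_CATEGORY PRIMARY KEY (CATEGORY_ID)
    );

    CREATE TABLE PRODUCT (
        PRODUCT_ID     INTEGER       NOT NULL,
        CATEGORY_ID    INTEGER       NOT NULL,
        PRODUCT_NAME   VARCHAR(100)  NOT NULL,
        UNIT_PRICE     DECIMAL(10,2),
        CONSTRAINT PK_PRODUCT PRIMARY KEY (PRODUCT_ID),
        CONSTRAINT FK_PRODUCT_CATEGORY FOREIGN KEY (CATEGORY_ID)
            REFERENCES PRODUCT_CATEGORY (CATEGORY_ID)
    );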
Environment: Rational Suite (RequisitePro, ClearQuest, ClearCase), Enterprise Architect, Mainframe, COBOL, Informatica, MS Excel, MS Project, MS SQL Server 2005, Visual Basic, ERwin 4.0, Data Integrator, Oracle 9i/10g, PL/SQL, UNIX, MS Visio, Windows NT.