Data Analyst/ Data Modeler Resume
SUMMARY
- Over 7 years of IT experience in the analysis, design, development, implementation, and troubleshooting of Data Mart / Data Warehouse applications across OLTP and OLAP environments, using ETL tools such as SSIS and Informatica PowerCenter and reporting tools such as Business Objects, Tableau, and Power BI.
- Expertise in building Enterprise Data Warehouses (EDW), Operational Data Stores (ODS), Data Marts, and Decision Support Systems (DSS) using multidimensional models (Kimball and Inmon) and Star and Snowflake schema designs addressing Slowly Changing Dimensions (SCDs).
- Experience in developing and maintaining logical and physical Data models for Data Warehouse applications using tools like Erwin and ER/Studio.
- Extensive integration experience as a Data Analyst, gathering data from different sources and performing data profiling, data analysis, data quality, data governance, and data management.
- Designed and developed ETL architecture to load data from sources such as DB2 UDB, Oracle, Teradata, Sybase, MS SQL Server, flat files, and XML files into Oracle, Teradata, SQL Server, XML, and AWS Redshift targets.
- Conducted data model reviews with business analysts, SMEs, product owners, and business users to ensure models meet requirements.
- Developed source-to-target mapping spreadsheets for the ETL team covering physical naming standards, data types, volumetrics, domain definitions, and corporate metadata definitions.
- Expertise in implementing complex business rules by creating robust ETL workflows using MS SQL Server Integration Services (SSIS) in SQL Server Data Tools (SSDT) 17.0.
- Extensively worked with Oracle PL/SQL Stored Procedures, Functions and Triggers and involved in Query Optimization.
- Experienced in creating dashboards in Power BI/Tableau.
- Good knowledge of views, synonyms, indexing, joins, ranking, and partitioning.
- Profound knowledge of Teradata utilities such as FastExport, FastLoad, and MultiLoad (MLoad).
- Developed BTEQ scripts to load data from the Teradata staging area to the data warehouse, and from the data warehouse to data marts for specific reporting requirements.
- Good experience in performing and supporting Unit testing, System Integration testing, UAT and production support for issues raised by application users.
- Developed UNIX scripts using the PMCMD utility and scheduled ETL loads using cron, Maestro, and Control-M.
- Knowledge of standardized ACORD XML formats used by insurance companies.
- Experienced in Agile methodologies and usage of JIRA, TFS, and Rally.
- Versatile team player with excellent analytical, presentation and interpersonal skills with an aptitude to learn new technologies.
- Excellent technical and professional client interaction skills; interacted with technical, functional, and business audiences across different phases of the project life cycle.
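Illustrative sketch (not from any client codebase): the Slowly Changing Dimension (Type 2) handling noted above behaves roughly as follows, shown in Python with sqlite3 standing in for the warehouse engine; the `dim_customer` table and all column names are hypothetical.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Hypothetical customer dimension with SCD Type 2 tracking columns.
cur.execute("""
    CREATE TABLE dim_customer (
        customer_key INTEGER PRIMARY KEY AUTOINCREMENT,
        customer_id  TEXT,
        city         TEXT,
        eff_date     TEXT,
        end_date     TEXT,
        is_current   INTEGER
    )
""")

def apply_scd2(customer_id, city, load_date):
    """Expire the current row if the tracked attribute changed, then insert a new version."""
    cur.execute(
        "SELECT customer_key, city FROM dim_customer "
        "WHERE customer_id = ? AND is_current = 1",
        (customer_id,),
    )
    row = cur.fetchone()
    if row and row[1] == city:
        return  # no change: nothing to do
    if row:
        # Close out the old version.
        cur.execute(
            "UPDATE dim_customer SET end_date = ?, is_current = 0 "
            "WHERE customer_key = ?",
            (load_date, row[0]),
        )
    # Insert the new current version with an open-ended end date.
    cur.execute(
        "INSERT INTO dim_customer (customer_id, city, eff_date, end_date, is_current) "
        "VALUES (?, ?, ?, '9999-12-31', 1)",
        (customer_id, city, load_date),
    )

apply_scd2("C001", "Boston", "2020-01-01")   # initial load
apply_scd2("C001", "Chicago", "2020-06-01")  # attribute change -> new version
conn.commit()

history = cur.execute(
    "SELECT city, is_current FROM dim_customer ORDER BY customer_key"
).fetchall()
print(history)  # two versions, only the latest flagged current
```

The same expire-then-insert pattern is what an SCD2 ETL mapping implements, whatever the tool.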
TECHNICAL SKILLS
Databases: Oracle 7.x/8.x/9i/10g/11g, SQL Server 2008/2005/2000, Teradata V2R4/V2R5 (SQL Assistant), DB2 UDB 7.2, MySQL 5.0/4.1, Sybase
ETL Tools: Informatica PowerExchange/PowerCenter/PowerMart 9.0.1/8.6.1/8.5/8.1.1, Informatica Data Quality (IDQ) Workbench 8.5, SSIS
Data Modeling Tools: ERWIN, ER/Studio, Toad Data Modeler.
Programming Skills: C++, Shell Scripting (ksh, csh), PL/SQL, FORTRAN, Java (Eclipse and NetBeans IDEs), HTML, JavaScript, J2EE, CSS
Methodologies: Data Modeling (Logical, Physical), Dimensional Modeling (Star / Snowflake)
Reporting Tools: SQL Server Reporting Services (SSRS), Tableau 10.3.2, Power BI
Operating Systems: UNIX (Sun Solaris, HP-UX), Windows 95/98/2000/NT/XP
PROFESSIONAL EXPERIENCE
Confidential
Data Analyst/ Data Modeler
Responsibilities:
- Working on multiple projects with different business units.
- Provided Subject Matter Expertise to System Analyst Leads, Developers and Testers.
- Provided investigation and root cause analysis support for operational issues and Application Support.
- Gathered requirements with Business Analysts, Systems Analysts, Developers, and DBAs, and translated them into detailed reporting requirements.
- Created and maintained Logical/Physical data models.
- Reverse-engineered data models of OLTP systems for further understanding and enhancements.
- Created Data Dictionary and documented mapping rules for ETL data pipelines.
- Conducted code reviews to reduce component bug density.
- Generated DDLs and created the business views required for business reporting on AWS Redshift databases.
- Responsible for performance tuning of queries pulling the information from AWS views.
- Participating in all Agile ceremonies and highlighting risk factors to ensure MVP deliverables.
- Proficient in using Rally and TFS for Agile activities.
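Illustrative sketch of the business-view pattern described above: a view exposes only the rows and columns reporting needs, decoupling reports from the base table. Python's sqlite3 stands in for Redshift here, and the `policy` table and `v_active_policy` view are hypothetical names.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Hypothetical base table (stands in for a Redshift table).
cur.execute("CREATE TABLE policy (policy_id TEXT, status TEXT, premium REAL)")
cur.executemany(
    "INSERT INTO policy VALUES (?, ?, ?)",
    [("P1", "ACTIVE", 1200.0), ("P2", "LAPSED", 800.0), ("P3", "ACTIVE", 950.0)],
)

# Business view exposing only what reporting needs.
cur.execute("""
    CREATE VIEW v_active_policy AS
    SELECT policy_id, premium
    FROM policy
    WHERE status = 'ACTIVE'
""")

rows = cur.execute(
    "SELECT policy_id, premium FROM v_active_policy ORDER BY policy_id"
).fetchall()
print(rows)  # [('P1', 1200.0), ('P3', 950.0)]
```

Because the report queries the view, the base table can be retuned or restructured without touching the reporting layer.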
Environment: ERWIN, MS SQL Server, Oracle, Microsoft Excel, Bitbucket, SharePoint, TFS, Rally.
Confidential
Data Analyst/ Data Modeler
Responsibilities:
- Refined use cases and backlog requirements provided by the Product Owner, Business Analysts, and Scrum Master, creating user stories on JIRA boards for the design team.
- Organized meetings with the SME, Product Owner, Data Architect, and BI developers to understand business requirements for the target DWH design.
- Assisted the Data Architect with data analysis, data profiling, and data quality, and with maintaining the granularity of target data models.
- Created simple and complex mapping rules to load dimension and fact tables per Star Schema techniques.
- Worked with business users to understand functional requirements, developed complex queries, and provided reports.
- Provided expert information governance, data management, and security guidance across the stakeholder community.
- Analyzed simple and complex logic involving calculations of ceded, gross, and reinsurance premium amounts for fact tables feeding Power BI reporting.
- Mentored a broader audience of ETL developers and the testing team through knowledge transfer (KT) sessions.
- Experienced in reverse and forward engineering using the ERWIN tool.
- Deployed DDLs for AWS Redshift table creation in DEV, UAT, and PROD.
- Created Redshift business views and deployed them to UAT and PROD.
- Resolved UAT and PROD release defects to improve product quality and meet project milestones in a timely manner.
- Developed bidirectional cross-filtering in Power BI Desktop, shaped and combined multiple data sources, and imported and analyzed data from web pages using Power BI Desktop.
- Analyzed data from Excel and an OData feed connected with Power BI Desktop (DAX formula compatibility in DirectQuery mode).
- Experienced in highlighting priorities, deadlines and blockers in Scrum meetings for completion of user stories.
- Solid understanding of Business Process definition, Risk Analysis and Agile methodologies.
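Illustrative sketch of the dimension/fact mapping rules described above: a fact load replaces each source row's natural key with the dimension's surrogate key via a lookup. sqlite3 stands in for the warehouse, and `dim_product` / `fact_sales` are hypothetical names.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
    CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, product_id TEXT);
    CREATE TABLE fact_sales (product_key INTEGER, amount REAL);
    INSERT INTO dim_product (product_key, product_id) VALUES (1, 'A'), (2, 'B');
""")

# Source rows carry the natural key; the mapping rule swaps it for the surrogate key.
source_rows = [("A", 100.0), ("B", 250.0), ("A", 75.0)]
for product_id, amount in source_rows:
    cur.execute(
        "INSERT INTO fact_sales (product_key, amount) "
        "SELECT product_key, ? FROM dim_product WHERE product_id = ?",
        (amount, product_id),
    )

total_by_key = cur.execute(
    "SELECT product_key, SUM(amount) FROM fact_sales "
    "GROUP BY product_key ORDER BY product_key"
).fetchall()
print(total_by_key)  # [(1, 175.0), (2, 250.0)]
```

Loading dimensions first and facts second, as here, is the standard ordering a Star Schema mapping document encodes.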
Environment: ERWIN, Sybase, Oracle, AWS Redshift, Microsoft Excel, Bitbucket, SharePoint, JIRA.
Confidential
Data Modeler
Responsibilities:
- Designed a staging data model to automate the processing of policy files and load them into legacy systems.
- Involved in various JAD meetings for converting Business level specifications into Data specifications/data models.
- Involved in sessions of Data Governance for processing manual files.
- Experience in business process data modeling using forward and reverse engineering concepts in ERWIN.
- Conducted reviews of data assets and controls as part of the monitoring capability, leading to improvements in data management practices.
- Reviewed data models with the entire team.
- Deployed DDL scripts into the DEV, UAT, and PROD environments.
- Experience in preparing Mapping documents for integration of new data files to the already existing legacy systems.
- Responsible for creating Data Dictionaries for different Hanover products (HHC, MPL, LPL).
- Responsible for deploying data models to enterprise data model repositories.
- Experience working with global teams.
- Gained knowledge of ACORD XML files used in the insurance domain.
- Experience working in the waterfall software development life cycle.
Environment: ERWIN, Oracle, DB2, Microsoft Excel, ACORD XML.
Confidential
Sr. Data Analyst
Responsibilities:
- Responsible for gathering business requirements.
- Developing new classification rules for the available training data sets.
- Creating classification data models using classification Transformation in Informatica Data Quality (IDQ).
- Designed, developed, and implemented Informatica Developer mappings for data cleansing using Address Validator, Labeler, Parser, Expression, Filter, Router, and Lookup transformations.
- Extensively used Informatica functions such as LTRIM, RTRIM, DECODE, ISNULL, IIF, and IS_DATE to cleanse data coming from legacy systems.
- Effectively worked with Repository Manager, Mapping Designer, Worklet Designer, Workflow Manager, and Workflow Monitor.
- Evaluate effectiveness of already existing classification rule sets and present recommended modifications to improve system performance.
- Extensively used Teradata utilities such as BTEQ, FastLoad, and MultiLoad, along with DDL and DML (SQL) commands.
- Responsible for monitoring automated BTEQ scripts and troubleshooting log errors.
- Performed tuning and optimization of complex SQL queries using Teradata EXPLAIN plans and Collect Statistics.
- Created a BTEQ script for pre-population of the work tables prior to the main load process.
- Created Primary Indexes (PIs) for both planned access of data and even distribution of data across all available AMPs; created appropriate Teradata NUPIs for fast, easy access of data.
- Involved in regression testing of ETL workflows and classification transformation models to improve scorecards.
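Illustrative sketch of the cleansing functions listed above: rough Python analogues of Informatica's LTRIM/RTRIM, DECODE, IIF, and IS_DATE, written here only to show the cleansing logic; the helper names and the date format are assumptions, not Informatica's own API.

```python
from datetime import datetime

def decode(value, *pairs, default=None):
    """Rough analogue of Informatica DECODE: pairwise (search, result) matching."""
    it = iter(pairs)
    for search, result in zip(it, it):
        if value == search:
            return result
    return default

def is_date(text, fmt="%m/%d/%Y"):
    """Rough analogue of IS_DATE: does the string parse under the given format?"""
    try:
        datetime.strptime(text, fmt)
        return True
    except (ValueError, TypeError):
        return False

raw = "  SMITH, JOHN  "
cleaned = raw.lstrip().rstrip()  # LTRIM / RTRIM
status = decode("A", "A", "Active", "I", "Inactive", default="Unknown")  # DECODE
flag = "valid" if is_date("02/28/2019") else "invalid"  # IIF(IS_DATE(...), ...)
print(cleaned, status, flag)
```

An IDQ mapping applies the same checks transformation by transformation; the point here is only the cleansing semantics.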
Environment: SQL Server, T-SQL, ETL, Teradata, Microsoft Excel, Informatica 9.x, Informatica Data Quality (IDQ), Informatica Lifecycle Management, Power BI.
