- Results-oriented IT professional with a strong background in database development and designing data models for Online Transactional Processing (OLTP) and Online Analytical Processing (OLAP) systems across the Health Care, Financial, Insurance, and Telecom domains.
- Strong background in data modeling tools including Erwin, IBM Data Architect, PowerDesigner, and MS Visio.
- Expert knowledge of the Software Development Life Cycle (SDLC) and solid experience in business analysis: reviewing, analyzing, and evaluating business systems and user needs, business modeling, and document processing.
- Good understanding of AWS, big data concepts and Hadoop ecosystem.
- Extensive experience in relational data modeling, dimensional data modeling, logical and physical data model design, ER diagrams, forward and reverse engineering, publishing Erwin diagrams, analyzing data sources, and creating interface documents.
- Extensive ETL testing experience using Informatica, Talend, Pentaho.
- Designed and developed data marts following Star Schema and Snowflake Schema methodologies, using industry-leading data modeling tools such as Erwin.
- Participated in JAD sessions; created use cases, workflows, and PowerPoint presentations.
- Gathered and translated business requirements into technical designs and developed the physical aspects of specified designs.
- Gathered and documented Functional and Technical Design documents.
- Extensive experience in data profiling and analysis.
- Worked extensively on Data Governance. Implemented metadata management to achieve regulatory compliance and quality in business intelligence.
- Worked with ETL tools to extract and transform data from relational databases and various file formats and load it into target databases.
- Excellent knowledge of SQL and coding PL/SQL packages and procedures.
- Experience in SQL performance tuning and optimization (design, memory, application, I/O) using EXPLAIN PLAN, tracing, and TKPROF.
- Experience creating materialized views, views, and lookups for the Oracle warehouse.
- Experience with Erwin model manager.
- Experience working with Master Data Management (MDM).
- A dedicated team player with excellent communication, organizational and interpersonal skills.
- 7+ years of strong experience with data profiling tools such as Informatica Data Quality (IDQ).
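The star/snowflake data mart design mentioned above can be sketched minimally. This is an illustrative example only (SQLite for portability; all table and column names are hypothetical, not from an actual engagement): one fact table keyed to dimension tables, queried with a typical star join.

```python
import sqlite3

# Minimal star-schema sketch: a fact table with foreign keys to dimensions.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE dim_date     (date_key INTEGER PRIMARY KEY, full_date TEXT);
CREATE TABLE dim_customer (customer_key INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE fact_claims (
    claim_id     INTEGER PRIMARY KEY,
    date_key     INTEGER REFERENCES dim_date(date_key),
    customer_key INTEGER REFERENCES dim_customer(customer_key),
    claim_amount REAL
);
INSERT INTO dim_date     VALUES (20240101, '2024-01-01');
INSERT INTO dim_customer VALUES (1, 'Acme');
INSERT INTO fact_claims  VALUES (1, 20240101, 1, 250.0);
""")
# Typical star join: aggregate the fact table by a dimension attribute.
rows = cur.execute("""
    SELECT c.name, SUM(f.claim_amount)
    FROM fact_claims f
    JOIN dim_customer c ON c.customer_key = f.customer_key
    GROUP BY c.name
""").fetchall()
print(rows)  # [('Acme', 250.0)]
```

In a snowflake variant, the dimensions themselves would be normalized into further lookup tables.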
DBMS: Oracle, Hadoop Big Data, Hive, NoSQL, MS SQL SERVER, Teradata, Netezza, (Cloud Platform) AWS, Amazon Redshift Spectrum, Snowflake
Database Specialties: Database Architecture, Data Analysis, Enterprise Data Warehouse, Database Design and Modeling, Data Extraction, Integration, Transformation and Migration, ETL Architecture and Design, Data Warehouse, OLTP, OLAP, Report Design and Development
Data Analysis: User Requirement Gathering, JAD Sessions, Gap Analysis, Data Cleansing, Data Transformations, Data Profiling, Source Systems Analysis, Data Relationships
Data Modeling: Erwin, InfoSphere Data Architect (IDA), MS Visio, Toad, PowerDesigner, ER/Studio
ETL Tools: Talend, Informatica, Data Stage
Reporting Tools: Tableau
Programming: PL/SQL, T-SQL
Scripting languages: JavaScript, PHP
Web technologies: HTML, XML
Microsoft Office: Word, Excel, Access
Confidential, Columbus, OH
- Derive data requirements and support solution design and development estimation efforts.
- Profile source systems using SQL queries and Informatica Data Quality to determine data types, data lengths and relationships.
- Determine data security and compliance requirements; data archival and purge policies, data quality and integrity constraints etc.
- Collaborate with architects to review candidate architecture for projects in Enterprise Architect.
- Experience in Dimensional Data Modeling, Star/Snowflake schema, FACT & Dimension tables.
- Assess system impacts due to data migration from legacy databases to enterprise data warehouse.
- Involve in architecting data migration and delivery solutions surrounding Hadoop implementation.
- Develop and/or Implement data matching process for attributes to be mastered in Informatica MDM.
- Expertise in relational data modeling (3NF) and dimensional data modeling.
- Identify, validate and leverage sources and target database schemas, ensuring conformity and reusability.
- Involve in data lineage and Informatica ETL mapping development, complying with data quality and governance standards.
- Developed data mapping, data governance, transformation, and cleansing rules for the Master Data Management architecture spanning OLTP, ODS, and OLAP.
- Create and enhance future-friendly logical data models and conduct design walkthroughs with internal teams and end users.
- Generate physical data models and DDL scripts for database objects, incorporating enterprise standards and industry best practices for the target database.
- Worked with data compliance and data governance teams to maintain data models, metadata, and data dictionaries, and to define source fields and their definitions.
- Create and administer database objects such as tables, sequences, history triggers, stored procedures etc., in sandbox environments for preliminary development and testing.
- Support database implementations, performance tuning such as query execution plan, data distribution and partitioning, issue resolution and clean-up efforts.
- Extracted large volumes of data from Amazon Redshift, AWS, and Elasticsearch using SQL queries to create reports.
- Created various Physical Data Models based on discussions with DBAs and ETL developers.
- Extensively used ETL methodology for supporting data extraction, transformations and loading processing, in a complex EDW using Informatica.
- Develop VB-based ER/Studio macros to improve productivity in customizing COTS data models such as Guidewire PolicyCenter and ClaimCenter.
- Experience with Big Data and NoSQL databases such as Cassandra, with technical expertise in Hadoop.
- Research and develop data management and modeling standards for platforms that support unstructured data such as Hadoop.
- Defined and deployed monitoring, metrics, and logging systems on AWS.
- Selecting the appropriate AWS service based on data, compute, database, or security requirements.
- Work on Amazon Redshift and AWS, architecting a solution to load data, create data models, and run BI on top of it.
- Defined facts and dimensions and designed the data marts using Ralph Kimball's dimensional modeling methodology in Erwin.
- Extensively used Star and Snowflake Schema methodologies.
- Coordinate with offshore ETL development and testing and reporting teams.
- Document and track data model changes involved in production defect fixes to retrofit database structures across development environments.
- Validate that semantics of data elements being reported align with business requirements.
- Capture, validate and publish metadata in accordance with enterprise data governance policies and MDM taxonomies.
- Perform data model repository check-in and enterprise metadata releases to the corporate portal, MetaCenter.
- Identify confidential and PII data elements and involve in enforcing appropriate protective measures such as tokenizing, masking, redacting etc. for data in flight and at rest.
Environment: IBM InfoSphere Data Architect (IDA), AWS, Hive, HDFS, Netezza, Teradata, Oracle 11g, Snowflake, BigIntegrate, Toad, NoSQL, SQL Server
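The source-system profiling described in this role (determining data types, lengths, and relationships with SQL queries) can be sketched as follows. This is a hedged, illustrative example only (SQLite for portability; `src_member` and its columns are hypothetical): one query computing row count, null count, distinct count, and maximum length for a candidate column.

```python
import sqlite3

# Sketch of SQL-based source profiling: basic column statistics
# that feed data-type and data-length decisions in the target model.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE src_member (member_id INTEGER, email TEXT);
INSERT INTO src_member VALUES (1, 'a@x.com'), (2, NULL), (3, 'a@x.com');
""")
profile = cur.execute("""
    SELECT COUNT(*)              AS row_count,
           SUM(email IS NULL)    AS null_count,
           COUNT(DISTINCT email) AS distinct_count,
           MAX(LENGTH(email))    AS max_length
    FROM src_member
""").fetchone()
print(profile)  # (3, 1, 1, 7)
```

A distinct count close to the row count suggests a candidate key; the max length informs the physical column size.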
Confidential, Chicago, IL
- Performed logical and physical data modeling using Erwin for Operational systems and Enterprise Data Warehouse, for projects such as Know Your Customer that focused on analyzing clients’ transaction habits.
- Analyzed legacy system; reverse engineered and refined existing data source models by standardizing naming of database objects such as table prefix/suffixes, indexes and relationship names etc.
- Applied normalization and denormalization concepts and design methodologies such as the Ralph Kimball and Bill Inmon approaches.
- Designed Star and Snowflake Data Models for Enterprise Data Warehouse using Erwin.
- Involved in capturing data lineage, table and column data definitions, reference data, valid values, and other relevant information in the data models.
- Used Normalization (1NF, 2NF & 3NF) and Denormalization techniques for effective performance in OLTP and OLAP systems.
- Integrated various relational and non-relational sources such as DB2, Teradata, Oracle, Netezza, SQL Server, NoSQL, XML, and flat files into the Netezza database.
- Applied Data Governance rules (Class words, Abbreviations, Data Domains, Data Definitions, and Metadata Information).
- Identified the affected business processes, the facts & dimensions; grain of fact, aggregate tables as part of data mart design.
- Identified and implemented missing business rules in the database using check constraints & triggers to verify transaction commit rules in coordination with DBAs.
- Extensively used Star and Snowflake Schema methodologies.
- Migrated data from legacy data management system to Agile PLM system.
- Involved in complete SQL Server Integration Services (SSIS) life cycle in building and deploying packages in Development and Production environments to import and export data to/from spreadsheets, SQL Server tables and flat files.
- Configured daily batch loads (Full & Incremental) into Staging and ODS areas, troubleshooting issues and handling events and errors using SSIS.
- Coordinated in Performance Tuning (SQL Tuning, Application/ETL Tuning).
- Wrote SQL queries for data retrieval and testing in development databases prior to migration.
- Used various transformations in SSIS Data Flow and Control Flow, including For Loop containers and Fuzzy Lookups.
- Developed parameterized, ad-hoc, drill-down, drill-through, and cross-tab reports in SSRS.
Environment: Erwin, DB2, Teradata, Oracle, Netezza, SSIS, SSRS, SSMS, Toad, NoSQL, SQL Server, Talend Data Governance
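The missing-business-rule enforcement described in this role (check constraints and triggers verifying transaction rules) can be sketched minimally. This is an illustrative example only (SQLite for portability; the `txn` table and the positive-amount rule are hypothetical stand-ins for the actual rules implemented):

```python
import sqlite3

# Sketch of enforcing a business rule in the database with a CHECK
# constraint: a posted transaction amount must be positive.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("""
    CREATE TABLE txn (
        txn_id INTEGER PRIMARY KEY,
        amount REAL NOT NULL CHECK (amount > 0)
    )
""")
cur.execute("INSERT INTO txn VALUES (1, 100.0)")        # accepted
try:
    cur.execute("INSERT INTO txn VALUES (2, -5.0)")     # violates the rule
except sqlite3.IntegrityError as e:
    print("rejected:", e)
# Only the valid row was committed to the table.
count = cur.execute("SELECT COUNT(*) FROM txn").fetchone()[0]
print(count)  # 1
```

Triggers serve the same purpose when the rule involves other rows or tables (e.g., validating a balance before commit), which a column-level CHECK cannot express.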
Sr. Data Analyst
- Conducted JAD sessions, gathering information from business analysts, SQL developers, end users, and stakeholders to determine the requirements of various systems.
- Created and documented models, data workflow diagrams, sequence diagrams, activity diagrams, and field mappings for all existing systems.
- Used ER/Studio Reverse Engineering, Forward Engineering, and Complete Compare functions to create, update, and resolve issues with existing data models.
- Created and implemented ER/Studio naming standards and domains in the design and creation of logical and physical data models in accordance with company standards.
- Worked with other data modelers, DBAs, and developers on my projects using Microsoft Team Foundation Server to ensure that every aspect of the project was effectively considered and completed in a timely manner.
- Prepared and presented graphical Entity Relationship Diagrams to elicit more information.
- Created and maintained conceptual, logical, and physical data models defining the information required for management and business intelligence purposes, using Erwin Data Modeler.
- Worked with the DBAs and SQL Developers to deploy and maintain the developed physical data models.
- Used Erwin Data Modeler to design and implement the company's health care domain models.
- Generated ad-hoc SQL queries using joins, database connections and transformation rules to fetch data from the source and SQL Server database systems.
Environment: ER/Studio Data Architect, MS SQL Server 2012/2014, ServiceNow, Salesforce, MS Visio, Microsoft Visual Studio 2012/2013, Microsoft Team Foundation Server
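The ad-hoc queries mentioned in this role (joins and transformation rules to fetch data from source systems) can be sketched as below. This is an illustrative example only (SQLite for portability; the `account`/`ticket` schema is a hypothetical stand-in for the actual source tables):

```python
import sqlite3

# Sketch of an ad-hoc join across source tables: count open tickets
# per account, the shape of a typical cross-system reporting query.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE account (account_id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE ticket  (ticket_id INTEGER PRIMARY KEY,
                      account_id INTEGER, status TEXT);
INSERT INTO account VALUES (1, 'Acme'), (2, 'Globex');
INSERT INTO ticket  VALUES (10, 1, 'open'), (11, 1, 'closed'),
                           (12, 2, 'open');
""")
rows = cur.execute("""
    SELECT a.name, COUNT(*) AS open_tickets
    FROM account a
    JOIN ticket t ON t.account_id = a.account_id
    WHERE t.status = 'open'
    GROUP BY a.name
    ORDER BY a.name
""").fetchall()
print(rows)  # [('Acme', 1), ('Globex', 1)]
```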