Lead Data Modeler Resume
Bethesda, MD
OBJECTIVE
- To obtain a position as a Lead Data Modeler
SUMMARY
- 7+ years of technical experience in IT
- Worked in the Healthcare, Telecommunications, and Insurance domains
- Significant projects delivered include an EDW implementation, data migration from Oracle to Redshift, and several data modeling initiatives
- Expertise in Ralph Kimball dimensional modeling, Inmon's normalization approach, and hybrid data modeling methods
- Experience in data modeling for columnar databases such as AWS Redshift, Netezza, and Teradata, and for NoSQL databases such as DynamoDB and MongoDB
- 7+ years of total data modeling experience using ErWin, ER Studio, and Power Designer, as well as Visio and draw.io
- Experience with ErWin 2020 R2, ErWin 9, and ErWin 7
- 6+ years of experience delivering conceptual, logical, and physical data models for enterprise data warehouses and data marts
- 6+ years of experience reverse engineering databases to generate data models and support troubleshooting
- 7+ years of experience in data analysis, data profiling, database normalization, and denormalization
- Expertise in tracking model updates, database DDL changes, and data dictionary maintenance and modifications to enable better data governance
- Proven experience collaborating with BI reporting users, ETL teams, testing teams, and DB admins on model implementation, deployment, and support
- Intermediate-level experience in ETL development using AWS Lambda (Python), Glue, and Informatica PowerCenter to import data from flat-file, CSV, and JSON sources as well as from operational source databases
- Worked with middleware and DBA teams during deployments, error root-cause analysis (RCA), troubleshooting, and data integration efforts
- Expertise in understanding business processes, managing projects and team members, providing technical and non-technical support services for clients
- Excellent communication and people-management skills, with a problem-solving attitude and the ability to troubleshoot the most complex situations
TECHNICAL SKILLS
Data Modeling Tools: ErWin, Sybase Power Designer 11/13, ER Studio
Database Management Systems: AWS-Redshift, DynamoDB, Teradata, Oracle, MS SQL Server 2005/08
ETL Tools: Informatica, AWS-Lambda, Glue
BI Tools: Business Objects, Cognos, MicroStrategy
Scripting/Programming: Bash, Java, Python, C-Shell, PERL
Operating Systems: Windows 10, Linux RHEL, UNIX
Defect Management: JIRA, Remedy, ClearQuest
Web Technologies: JavaScript, PHP, MySQL, XML, JSP
PROFESSIONAL EXPERIENCE
Confidential, Bethesda, MD
Lead Data Modeler
Responsibilities:
- Currently working as a Lead Data Modeler and Analyst for Enterprise Architects
- Conduct JAD sessions with the client's business teams to assess reporting requirements, estimate data modeling effort, and set project delivery timelines
- Generated conceptual, logical and physical data models using ErWin 9.7 and ErWin 2020
- Conduct data profiling using plain SQL, Informatica Analyzer, and MS Excel (a sample profiling query appears after this list)
- Imported data using Python, Informatica ETL, and AWS Lambda code from source databases into the Dev/Stage environment for analysis purposes
- Created 3NF models for OLTP data stores
- Developed MS Excel VBA code for data element catalog maintenance and for updating ErWin entity-attribute metadata
- Organized data model walkthrough sessions to review data models, S2TM documents, and DDL structures with data engineers, ETL developers, and testers
- Supported reporting teams during unit testing, QA, and UAT by deploying data model DDL to the database and applying model changes immediately
- Provided on-call support during deployment and migration windows
- Created source-to-target mapping (S2TM/STTM) files and data model data dictionaries to support ETL development
- Implemented Redshift's column-level GRANT feature to enforce PII security controls on the database (see the sketch after this list)
- Played a significant role in migrating data from Oracle to Redshift using the Schema Conversion Tool and data migration jobs
- Analyzed Oracle PL/SQL stored procedures in legacy Oracle data transfer jobs to understand table and data relationships
- Conducted training sessions for business users and technical SMEs on data model navigation, cube analysis, data marts, and SQL queries
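A minimal sketch of the kind of plain-SQL profiling query referenced above; the schema, table, and column names are hypothetical placeholders, not actual project objects:

```sql
-- Basic profiling of a hypothetical staging table: row counts,
-- distinct counts, null counts, and value lengths.
SELECT
    COUNT(*)                                            AS total_rows,
    COUNT(DISTINCT member_id)                           AS distinct_member_ids,
    SUM(CASE WHEN member_id IS NULL THEN 1 ELSE 0 END)  AS null_member_ids,
    MIN(LENGTH(member_name))                            AS min_name_length,
    MAX(LENGTH(member_name))                            AS max_name_length
FROM stg.member_profile;
```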
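An illustrative sketch of the column-level security approach on Redshift; the table, columns, and group are placeholder names used only to show the pattern:

```sql
-- Grant access only to non-PII columns; because no table-level SELECT
-- is granted to this group, PII columns such as ssn stay inaccessible.
GRANT SELECT (member_id, plan_code, enrollment_date)
    ON edw.member_profile
    TO GROUP bi_reporting;
```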
Confidential, Columbus, OH
Sr. Data Modeler
Responsibilities:
- Worked as a Data Modeler for the Pharma and Medical teams at Confidential
- Played a major role in reverse engineering, performance assessment and tuning of existing data models based on Teradata
- Followed Agile (Scrum) methodology, with work assigned to the team as stories and data models delivered per the planned cycles
- Delivered data models as part of Agile sprints by working closely with the Developers and Testers
- Used Sybase Power Designer to create and maintain Logical and Physical data models in a team of 5 data modelers
- Created dimensions and facts as physical tables using Ralph Kimball methodologies against Teradata, Oracle, and AWS-Redshift (a star-schema sketch follows this list)
- Conducted data profiling on incoming source data using direct SQL, Excel, and Data Stewards' applications for better metadata and data governance
- Worked with data engineers to design and build AWS ETL data pipelines, helping them understand the data models and the surrogate key implementation
- Created source-to-target mapping (S2TM) documents covering data model and staging table designs to help ETL teams with data loading processes
- Conducted meetings with business user teams for modeling requirements, business processes documentation, UAT and usability validations
- Created data models for the Medical and Claims lines of business
- Enhanced existing naming convention files to correct abbreviations and enable auto-conversion during physical data model conversion
- Used Excel-based automation tools to create Logical and Physical data models
- Trained junior data modelers, analysts, and technical users on data modeling concepts, normalization, data marts, and general SQL analysis
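A minimal sketch of a Kimball-style dimension and fact, with hypothetical table and column names; distribution, indexing, and constraint syntax would vary by target platform (Teradata, Oracle, or Redshift):

```sql
-- Conformed date dimension and a claim fact joined on surrogate keys.
CREATE TABLE dim_date (
    date_key      INTEGER       NOT NULL PRIMARY KEY,  -- e.g. 20240131
    calendar_date DATE          NOT NULL,
    fiscal_year   SMALLINT      NOT NULL,
    month_name    VARCHAR(10)   NOT NULL
);

CREATE TABLE fact_claim (
    claim_key     BIGINT        NOT NULL,              -- ETL-generated surrogate key
    date_key      INTEGER       NOT NULL REFERENCES dim_date (date_key),
    member_key    BIGINT        NOT NULL,              -- FK to dim_member (not shown)
    claim_amount  DECIMAL(12,2) NOT NULL,
    claim_count   INTEGER       NOT NULL DEFAULT 1
);
```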
Confidential, Columbus, OH
Data Modeler
Responsibilities:
- Worked in a lead consultant role to suggest, maintain, and modify the Claims data model for improvements and optimization
- Created data models for member profiles, eligibility, and providers
- Loaded data from provider and payer files to de-duplicate records and derive correct cardinality relationships among the tables
- Followed Agile sprint methodology, delivering data models in Sprint 0 along with the architecture design and updating the models in parallel during subsequent sprints
- Generated logical and physical data models for Providers, payers, and member profiles using ErWin 9.7 and SQL Server, Oracle as target databases
- Used ER Studio as the data modeling tool
- Developed a SQL-based de-duplication mechanism for data imported into target data model tables (see the sketch after this list)
- Conducted data profiling and data analysis, and established primary and foreign key relationships
- Converted existing data models from Visio format to ER Studio while preserving all existing design and development work
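A minimal sketch of one common SQL de-duplication pattern (keep the most recent record per natural key); the schema, table, and column names are hypothetical:

```sql
-- Keep only the latest row per provider NPI before loading the target table.
INSERT INTO tgt.provider (provider_npi, provider_name, specialty_code, load_ts)
SELECT provider_npi, provider_name, specialty_code, load_ts
FROM (
    SELECT s.*,
           ROW_NUMBER() OVER (
               PARTITION BY provider_npi
               ORDER BY load_ts DESC
           ) AS rn
    FROM stg.provider_raw s
) d
WHERE rn = 1;
```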
Confidential, Columbus, OH
Data Modeler
Responsibilities:
- Worked as a Data Modeler to design, develop and modify the data models of the Enterprise Innovation team
- Coordinated with the EI team's stakeholders to understand BI and analytics requirements for data and analytics reports
- Created logical and physical Data models using ErWin software
- Implemented ErWin UDPs across multiple data models to improve documentation and security role generation
- Developed ETL code for initial data analysis and data profiling to establish correct data types, lengths, and primary/foreign key relationships
- Followed traditional SDLC-Waterfall method for Data Model delivery
- Modified existing Netezza table DDL as part of performance tuning efforts
- Created semantic views, tables, and SQL query reports for the presentation layer that serves the reporting layers
- Created conceptual models in ErWin and draw.io to capture business users' understanding of their current legacy systems
- Led a team of 6 data modelers and addressed data model repository maintenance, exclusive locking, and ETL development issues
- Played a key role in migrating data from legacy Oracle to Redshift database
- Used the Schema Conversion Tool (SCT) to generate Redshift-equivalent DDL and apply record updates (a conversion sketch follows this list)
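An illustrative before/after of the kind of DDL conversion involved; the table and columns are placeholders, and the distribution/sort key choices are assumptions about a query pattern, not actual project settings:

```sql
-- Legacy Oracle definition (shown as a comment for reference):
--   CREATE TABLE sales_txn (
--       txn_id    NUMBER(18)     NOT NULL,
--       txn_date  DATE           NOT NULL,
--       amount    NUMBER(12,2),
--       comments  VARCHAR2(4000)
--   );

-- Redshift-equivalent DDL after conversion:
CREATE TABLE sales_txn (
    txn_id    BIGINT        NOT NULL,
    txn_date  DATE          NOT NULL,
    amount    DECIMAL(12,2),
    comments  VARCHAR(4000)
)
DISTKEY (txn_id)
SORTKEY (txn_date);
```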
Confidential, Columbus, OH
Data Modeler
Responsibilities:
- Worked as a Physical Data Modeler to design, develop and modify the data models of the Customer Data Warehouse team
- Worked in an Agile setup and delivered data models every sprint under tight schedules
- Analyzed data from multiple data warehouses, including internal data warehouses, as well as transactional data and vendor data supplied in Excel/CSV files
- Standardized data storage and access mechanisms on physical tables using Primary Index and Secondary Index definitions (see the Teradata DDL sketch after this list)
- Created new "nsm" naming standards files by incorporating existing Teradata DW names, business terminology, and data governance policies
- Worked with DBA teams to set up the development environment and loaded sample records from staging sources for better data validation
- Conducted data analysis and profiling to determine correct data type definitions and used them in target table DDL statements
- Engaged testing and ETL developers early to complete project deliveries on time with smoother UAT and fewer errors
- Standardized the Teradata config file in ErWin software for data domains, types, and lengths including UDP validations
- Converted logical data models to physical data models for Teradata V12
- Created views where possible to avoid unnecessary space usage
- Created logical and physical Data models using ErWin software
- Implemented ErWin UDPs across multiple data models to improve documentation and security role generation
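A minimal Teradata DDL sketch of the Primary Index / Secondary Index standardization described above; the database, table, and column names are hypothetical:

```sql
-- Unique primary index for even data distribution and direct access by key.
CREATE MULTISET TABLE cdw.customer_account (
    account_id     BIGINT   NOT NULL,
    customer_id    BIGINT   NOT NULL,
    account_status CHAR(1)  NOT NULL,
    open_date      DATE     NOT NULL
)
UNIQUE PRIMARY INDEX (account_id);

-- Secondary index to support a common alternate access path.
CREATE INDEX customer_idx (customer_id) ON cdw.customer_account;
```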