
Data Modeler Resume


Nashville, TN

PROFESSIONAL SUMMARY:

  • 10+ years of experience in Data Modeling, business and Data Analysis, production support, Database Management, strategic analysis, requirements gathering, data mapping and data profiling.
  • Worked with Technical Architects and Database analysts for the Design of Summary tables required for efficient Report Design.
  • Understanding of the development of Conceptual, Logical and Physical Models for Online Transaction Processing (OLTP) and Online Analytical Processing (OLAP).
  • Experience in developing Entity-Relationship diagrams and modeling Transactional Databases and Data Warehouses using tools like ERWIN, ER/Studio and Power Designer.
  • Experience in writing SQL queries and optimizing the queries in Oracle, SQL Server, Netezza, Teradata and Big Data.
  • Experience in automating infrastructure in AWS using web services.
  • Experience in building policies for access control and user profiles using AWS IAM and S3 bucket policies.
  • Expertise in designing Star schema (identification of facts, measures, and dimensions) and Snowflake schema for Data Warehouse and ODS architecture using tools like Erwin Data Modeler, Power Designer, and ER/Studio (a brief DDL sketch follows this list).
  • Analyze raw data from internal, external, and demographic data sources and apply data mining techniques.
  • Excellent understanding and working experience of industry standard methodologies like System Development Life Cycle (SDLC), as per Rational Unified Process (RUP), AGILE and Waterfall Methodologies
  • Experience in designing star schema, Snowflake schema for Data Warehouse, ODS architecture.
  • Expert in building Enterprise Data Warehouses and data warehouse appliances from scratch using both the Kimball and Inmon approaches.
  • Good understanding and hands on experience with AWS S3, EC2 and Redshift, Document based databases.
  • Strong background in data modeling tools such as ERWIN, ER/Studio and Power Designer.
  • Experienced in designing, building and implementing the complete Hadoop ecosystem comprising MapReduce, HDFS, Hive, Impala, Pig, Sqoop, Oozie, HBase, MongoDB, and Spark.
  • Good experience in developing a data governance model and standard operating procedures to manage over 10,000 data elements tracked by dashboards.
  • Expertise in creating DDL scripts for implementing Data Modeling changes.
  • Hands-on experience in writing MapReduce jobs using Java and managing single-node and multi-node cluster configurations.
  • Good understanding and hands-on experience in setting up and maintaining NoSQL databases like Cassandra, MongoDB, HBase, and Cosmos DB.
  • Experienced in creating, managing and delivering server-based reports with interactive views that provide valuable insight for business leaders using Tableau, MicroStrategy and TIBCO Spotfire.
  • Expertise in Normalization (1NF, 2NF, 3NF and BCNF)/De-normalization techniques for effective and optimum performance in OLTP and OLAP
  • Excellent knowledge in Data Validation, Data Cleansing, Data Verification and identifying data mismatch.
  • Good knowledge of SQL/PL-SQL programming; developed stored procedures and triggers.
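
For illustration, a minimal star schema DDL sketch of the kind described above, using a hypothetical sales subject area (all table and column names are illustrative assumptions, not from an actual engagement):

    -- Hypothetical dimension tables
    CREATE TABLE dim_date (
        date_key       INTEGER      PRIMARY KEY,  -- surrogate key, e.g. 20240131
        calendar_date  DATE         NOT NULL,
        month_name     VARCHAR(10)  NOT NULL,
        year_number    SMALLINT     NOT NULL
    );

    CREATE TABLE dim_customer (
        customer_key   INTEGER      PRIMARY KEY,  -- surrogate key
        customer_id    VARCHAR(20)  NOT NULL,     -- natural/business key
        customer_name  VARCHAR(100) NOT NULL,
        region         VARCHAR(50)
    );

    -- Fact table: grain is one row per customer per day
    CREATE TABLE fact_sales (
        date_key      INTEGER       NOT NULL REFERENCES dim_date (date_key),
        customer_key  INTEGER       NOT NULL REFERENCES dim_customer (customer_key),
        sales_amount  DECIMAL(12,2) NOT NULL,     -- additive measure
        units_sold    INTEGER       NOT NULL,     -- additive measure
        PRIMARY KEY (date_key, customer_key)
    );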

SKILLS TABLE:

Programming Languages: SQL, PL/SQL, UNIX shell Scripting, VBScript, PERL, AWK, SED

Databases: Oracle 12c/11g/10g/9i, Teradata R12, R13, R14, R15, MS SQL Server 2005/2008, MS Access

Tools: MS-Office suite (Word, Excel, MS Project and Outlook), VSS

Testing and defect tracking Tools: HP/Mercury (Quality Center, Win Runner, Load Runner, Quick Test Professional, Performance Center, VU Scripting, Business Availability Center), Requisite, MS Visio & Visual Source Safe, Salesforce.

Operating System: Windows Vista/XP/2000/98/95, DOS, UNIX

ETL/Datawarehouse Tools: Informatica 9.5/9.1/8.6.1/8.1 (Repository Manager, Designer, Workflow Manager, and Workflow Monitor), SAP Business Objects XIR3.1/XIR2, Web Intelligence

Data Modeling: Star-Schema Modeling, Snowflake-Schema Modeling, FACT and dimension tables, Pivot Tables, Erwin and SAP Power Designer.

Tools & Software: TOAD, MS Office, BTEQ, Teradata SQL Assistant

Business Intelligence tools: SSDT-BI, SSIS/SSRS/SSAS solutions

PROFESSIONAL EXPERIENCE:

Confidential, Nashville, TN

Data Modeler

Responsibilities:

  • Understood the business need and Project requirements in a large data analytics product.
  • Worked as a Data Architect / Modeler to generate Data Models using SAP PowerDesigner and developed relational database system.
  • Worked on Software Development Life Cycle (SDLC) with good working knowledge of testing, Agile methodology, disciplines, tasks, resources and scheduling.
  • Developed the long-term data warehouse roadmap and architecture, and designed and built the data warehouse framework per the roadmap.
  • Developed logical data models and physical database design and generated database schemas using SAP PowerDesigner.
  • Designed and developed architecture for data services ecosystem spanning Relational, NoSQL, and Big Data technologies.
  • Designed both 3NF data models for ODS and OLTP systems and dimensional data models using star and snowflake schemas.
  • Developed data mapping, data governance, transformation, and cleansing rules for the Master Data Management architecture involving OLTP and ODS.
  • Worked with delivery of Data & Analytics applications involving structured and unstructured data on Hadoop-based platforms on AWS EMR.
  • Worked on MongoDB database concepts such as locking, transactions, indexes, sharding, replication, and schema design.
  • Provided guidance and solution concepts for multiple projects focused on data governance and master data management.
  • Used Hadoop eco system for migrating data from relational staging databases to big data.
  • Responsible for Big Data initiatives and engagement including analysis, brainstorming, POCs, and architecture.
  • Loaded and transformed large sets of structured, semi structured and unstructured data using Hadoop/Big Data concepts.
  • Worked on Amazon Redshift and AWS, architecting a solution to load data and create data models.
  • Developed and maintained the data dictionaries, Naming Conventions, Standards, and Class words Standards Document.
  • Developed Data Migration and Cleansing rules for the Integration Architecture (OLTP, ODS, DW).
  • Did PoCs on Logstash and Elasticsearch for ETL processes.
  • Did PoC on Octopus for code deployment.
  • Used SSIS and T-SQL stored procedures to transfer data from OLTP databases to the staging area and finally into data marts, and performed operations on XML data.
  • Backed up databases using the MongoDB backup facility in Ops Manager.
  • Performed performance tuning and stress-testing of NoSQL database environments to ensure acceptable database performance in production mode.
  • Worked on a T-SQL log4 framework for logging events and exception handling.
  • Implemented strong referential integrity and auditing through triggers and SQL scripts (see the T-SQL sketch after this list).
  • Trained junior data modelers on data architecture standards and data modeling standards.
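
As a brief illustration of the trigger-based auditing mentioned above, a minimal T-SQL sketch with hypothetical table and column names (Customer and Customer_Audit are assumptions for illustration, not actual project objects):

    -- Hypothetical audit table capturing name changes to Customer
    CREATE TABLE dbo.Customer_Audit (
        AuditId    INT IDENTITY(1,1) PRIMARY KEY,
        CustomerId INT          NOT NULL,
        OldName    VARCHAR(100) NULL,
        NewName    VARCHAR(100) NULL,
        ChangedBy  SYSNAME      NOT NULL DEFAULT SUSER_SNAME(),
        ChangedAt  DATETIME2    NOT NULL DEFAULT SYSUTCDATETIME()
    );
    GO

    -- AFTER UPDATE trigger writing one audit row per updated customer
    CREATE TRIGGER dbo.trg_Customer_Audit
    ON dbo.Customer
    AFTER UPDATE
    AS
    BEGIN
        SET NOCOUNT ON;
        INSERT INTO dbo.Customer_Audit (CustomerId, OldName, NewName)
        SELECT d.CustomerId, d.CustomerName, i.CustomerName
        FROM deleted AS d
        JOIN inserted AS i ON i.CustomerId = d.CustomerId;
    END;
    GO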

Confidential, Boston, MA

Data Modeler

Responsibilities:

  • Understood the business need and Project requirements in a large data analytics product.
  • Designed and developed architecture for data services ecosystem spanning Relational, NoSQL, and Big Data technologies.
  • Loaded data into Hive Tables from Hadoop Distributed File System (HDFS) to provide SQL access on Hadoop data.
  • Worked on the Hadoop ecosystem, Hive queries, MongoDB, Cassandra, Pig, and Apache Storm.
  • Loaded and transformed large sets of structured, semi structured and unstructured data using Hadoop/Big Data concepts.
  • Involved in data modeling activities in the project.
  • Completed ASPIRE healthcare training provided by EMIDS.
  • Attended numerous sessions to understand the Healthcare Domain and the concepts related to the project (Healthcare Informatics).
  • Collaborated with other data modelers to understand and implement best practices within the organization.
  • Used the ER/Studio tool extensively for database design and modeling.
  • Used the Liquibase tool effectively for versioning and maintaining SQL scripts.
  • Took basic training on Amazon Web Services provided by the client.
  • Worked with technical leads to implement an Aurora DB cluster and manage the databases required by the project.
  • Conducted numerous POCs (Proof of Concepts) to efficiently import large data sets into the database from AWS S3 Bucket.
  • Improved database performance and query execution through indexing and partitioning.
  • Completed a POC (Proof of Concept) to efficiently convert CSV files to the Parquet file format (see the HiveQL sketch after this list).
  • Learned about the columnar data format as part of the above POC.
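
A minimal HiveQL sketch of the CSV-to-Parquet conversion approach described above, with hypothetical table names and HDFS paths (the claims schema is illustrative only):

    -- External table over raw CSV files landed in HDFS
    CREATE EXTERNAL TABLE staging_claims_csv (
        claim_id     STRING,
        member_id    STRING,
        claim_amount DECIMAL(12,2),
        service_date STRING
    )
    ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
    STORED AS TEXTFILE
    LOCATION '/data/raw/claims/'
    TBLPROPERTIES ('skip.header.line.count'='1');

    -- Rewrite the same data as a columnar Parquet table for faster analytical scans
    CREATE TABLE claims_parquet
    STORED AS PARQUET
    AS
    SELECT claim_id, member_id, claim_amount, service_date
    FROM staging_claims_csv;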

Confidential, Richmond, VA

Data Analyst/Modeler

Responsibilities:

  • Worked on subject areas such as SSR, Trauma, NSQIP-Adult, NSQIP-Peds, Bariatrics and CANCER
  • Worked with C&R team and directly with clients to gather business requirements and converted those to technical requirements.
  • Worked on Aptify front end for various data migration, data validation and testing activities.
  • Researched, evaluated, architected, and deployed new tools, frameworks, and patterns to build sustainable Big Data platforms for our clients.
  • Handled importing of data from various data sources, performed transformations using Hive and MapReduce, loaded data into HDFS, and extracted data from Oracle into HDFS using Sqoop.
  • Designed Enterprise Logical Data Model, for specific subject areas based on the requirements.
  • Assisted Data Analyst with source to Target mapping documents and API specification documents.
  • Designed a data model for Trauma using Erwin and created conceptual, logical and physical data models.
  • Involved with data profiling for multiple sources and answered complex business questions by providing data to business users (a sample profiling query follows this list).
  • Worked with data investigation, discovery and mapping tools to scan every single data record from many sources.
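
A simple example of the kind of profiling query used in the work above, against a hypothetical source table (patient_record and its columns are assumptions for illustration):

    -- Basic column profiling: row counts, distinct values, null counts, and value ranges
    SELECT
        COUNT(*)                                            AS total_rows,
        COUNT(DISTINCT patient_id)                          AS distinct_patient_ids,
        SUM(CASE WHEN admit_date IS NULL THEN 1 ELSE 0 END) AS null_admit_dates,
        MIN(admit_date)                                     AS earliest_admit,
        MAX(admit_date)                                     AS latest_admit
    FROM patient_record;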

Confidential, NJ

SQL Developer - Intern

Responsibilities:

  • Worked as SQL Developer intern in Keshav Memorial Institute of Technology
  • Learned how to write Complex Stored Procedures
  • Did PoC in requirement gathering for projects
  • Learned Reverse Engineering and Forward Engineering
  • Learned how to write Complex Views.
  • Designed three Databases as Part of the college project
  • Learned how to create Crystal Reports and Tableau
  • Did data analysis on various subjects to learn Data profiling.
  • Implemented Data conversion and data migration techniques when necessary through the project life cycle.
  • Worked on master-child stored procedures to automate execution and reduce complexity.
  • Utilized SQL Profiler to identify blocking and deadlocks in transactions and implemented isolation levels and locking to avoid them.
  • Worked on table index analysis and came up with more efficient solutions using Clustered and Non-Clustered Indexes for a significant performance boost (see the index sketch after this list).
  • Created Pre-Staging and Staging database as intermediate stages for populating data mart and to perform ETL data massage operations, e.g., data profiling and data cleansing.
  • Worked with various professors to learn more on real time business scenarios.
  • Did a Proof of Concept in which I moved data from source to target using the Informatica ETL tool.
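
For the index analysis noted above, a minimal SQL Server sketch with hypothetical table and column names (dbo.Orders is an assumption for illustration):

    -- Clustered index: physically orders the table by OrderDate for efficient range scans
    CREATE CLUSTERED INDEX IX_Orders_OrderDate
        ON dbo.Orders (OrderDate);

    -- Non-clustered index: supports lookups by CustomerId, covering the Status column
    CREATE NONCLUSTERED INDEX IX_Orders_CustomerId
        ON dbo.Orders (CustomerId)
        INCLUDE (Status);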
