Sr. Data Modeler Resume
Reston, VA
SUMMARY
- Over 8 years of IT industry experience with hands-on work in data modeling.
- Experience with all stages of the SDLC and Agile development, from requirements gathering through deployment and production support.
- Experience building Logical Data Models (LDM) and Physical Data Models (PDM) using the Erwin, ER/Studio, and Power Designer data modeling tools.
- Working experience with Azure Cloud Services, Azure Storage, and SQL Azure.
- Effective development and support knowledge of Oracle SQL, PL/SQL, and T-SQL queries.
- Good working knowledge of Amazon Web Services, including AWS Redshift and AWS EMR.
- Knowledge of modeling OLAP systems using the Kimball and Bill Inmon data warehousing methodologies.
- Big Data/Hadoop, data analysis, and data modeling professional with applied information technology experience.
- Strong experience in entity-relationship and dimensional data modeling, delivering normalized ER models and Star/Snowflake schemas (a minimal schema sketch appears at the end of this summary).
- Expertise in SQL Server Analysis Services (SSAS), SQL Server Integration Services (SSIS), and SQL Server Reporting Services (SSRS).
- Hands-on experience with normalization and denormalization techniques for optimal performance in OLTP and OLAP environments.
- Good experience with relational data modeling (3NF) and dimensional data modeling.
- Strong experience in data migration, data cleansing, transformation, integration, data import, and data export.
- Extensive experience working with XML, schema design, and XML data.
- Strong background in database development and the design of data models across different domains.
- Experience designing interactive dashboards and reports and performing ad-hoc analysis and visualization using Tableau and Power BI.
- Experience in extract, transform, and load (ETL) processing of large datasets in various forms, including structured, semi-structured, and unstructured data.
- Executed change management processes surrounding new releases of SAS functionality.
- Strong background in data processing and data analysis, with hands-on experience in MS Excel, MS Access, UNIX, and Windows servers.
- Experience in reporting using Business Objects and Crystal Reports; designed WebI/CR reports.
- Good documentation skills; involved in producing functional and technical documentation.
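
For illustration, a minimal sketch of the star-schema pattern referenced above, in generic SQL; all table and column names are hypothetical examples rather than objects from any engagement.

```sql
-- Illustrative star schema: one fact table keyed to conformed dimensions.
-- All names are hypothetical, not taken from client systems.
CREATE TABLE dim_date (
    date_key      INT PRIMARY KEY,       -- surrogate key, e.g. 20240131
    full_date     DATE NOT NULL,
    fiscal_year   INT NOT NULL
);

CREATE TABLE dim_customer (
    customer_key  INT PRIMARY KEY,       -- surrogate key
    customer_id   VARCHAR(20) NOT NULL,  -- natural (business) key
    customer_name VARCHAR(100)
);

CREATE TABLE fact_sales (
    date_key      INT NOT NULL REFERENCES dim_date (date_key),
    customer_key  INT NOT NULL REFERENCES dim_customer (customer_key),
    sales_amount  DECIMAL(18,2) NOT NULL,
    units_sold    INT NOT NULL
);
```

A snowflake variant would further normalize the dimensions, for example splitting customer geography into its own table.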
TECHNICAL SKILLS
Data Modeling Tools: Erwin Data Modeler, Power Designer and ER/Studio.
Cloud Platforms: AWS (EC2, S3, Redshift), MS Azure (Data Lake, Databricks, Data Factory)
Methodologies: JAD, SDLC, Agile, Waterfall
Reporting Tools: SSRS, SSIS, Tableau, SSAS, MS Excel.
OLAP Tools: Tableau 7, SAP BO, SSAS, Business Objects, and Crystal Reports 9
Databases: Oracle 12c/11g, Teradata R15/R14, MS SQL Server 2016/2014, DB2.
Operating System: Windows, Unix, Sun Solaris
ETL/Data warehouse Tools: Informatica 9.6/9.1, SAP Business Objects XIR3.1/XIR2.
Programming Languages: SQL, PL/SQL, XML and VBA.
PROFESSIONAL EXPERIENCE
Confidential - Reston, VA
Sr. Data Modeler
Responsibilities:
- Worked as a Data Modeler for both OLTP and Data warehousing environments.
- Collaborated with and across agile teams to design, develop, test, implement, and support technical solutions in Microsoft Azure Data Platform.
- Installed, configured and maintained data processing pipelines.
- Worked with the data governance team to maintain data models and data dictionaries and to define source fields and their definitions.
- Involved in relational and dimensional data modeling to create logical and physical database designs and ER diagrams with all related entities and relationships.
- Designed, deployed, maintained, and led the implementation of cloud solutions using Microsoft Azure and underlying technologies.
- Involved in sourcing data from various source systems, such as Oracle and flat files, per requirements.
- Worked at the conceptual, logical, and physical data model levels using Erwin Data Modeler, according to requirements.
- Used ETL to develop jobs for extracting, cleaning, transforming, and loading data into the data warehouse.
- Integrated custom visuals based on business requirements using Power BI Desktop.
- Implemented a big data solution using Hadoop and Hive to pull and load data into HDFS.
- Generated JSON files from the JSON models created for Zip Code, Group, and Claims using Snowflake DB.
- Involved in several facets of MDM implementations including Data Profiling, Metadata acquisition and data migration.
- Designed and implemented database solutions in Azure SQL Data Warehouse and Azure SQL.
- Built various graphs for business decision-making using Python's matplotlib library.
- Developed requirements and performed data collection, cleansing, transformation, and loading to populate facts and dimensions for the data warehouse.
- Designed and developed T-SQL stored procedures to extract, aggregate, transform, and insert data (a pattern sketched after this role).
- Involved in moving all log files generated from various sources to HDFS for further processing through Flume.
- Developed JSON definitions for deploying pipelines in Azure Data Factory (ADF) that process data using the SQL activity.
- Developed scripts that automated the DDL and DML statements used in the creation of databases, tables, constraints, and updates.
- Developed dimensional model for Data Warehouse/OLAP applications by identifying required facts and dimensions.
- Involved in ETL and processing data transfer from RDBMS to HDFS (Hadoop environment).
- Involved in ETL, Data Integration and Migration.
- Implemented Azure Databricks clusters, notebooks, jobs, and autoscaling.
- Created an XML parser to convert XML data to CSV and then load the data into the data model after validation.
- Established uniform Master Data Dictionary and Mapping rules for metadata, data mapping and lineage.
- Developed MapReduce programs to parse the raw data, populate staging tables and store the refined data in partitioned tables in the EDW.
- Loaded tables from Azure Data Lake to Azure Blob Storage to push them to Snowflake.
- Implemented Kafka high-level consumers to get data from Kafka partitions and move it into HDFS.
- Loaded the dataset into Hive for ETL Operation.
- Created Use Case Diagrams using UML to define the functional requirements of the application.
- Used JIRA to track issues and Change Management.
- Worked on reporting requirements and was involved in generating reports for the data model using Crystal Reports.
- Generated ad-hoc SQL queries using joins, database connections and transformation rules to fetch data from legacy Oracle and SQL Server database systems
- Involved in creating dashboards and reports in Tableau.
Environment: Erwin Data Modeler, Azure, SQL, ETL, Flume, Power BI, PL/SQL, Hive 3.1.2, T-SQL, MDM, Agile, Oracle 12c, Hadoop, OLAP, OLTP, Tableau.
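
As a hedged illustration of the T-SQL extract-aggregate-insert pattern mentioned in this role, the sketch below reloads one day of aggregated claims; stg.claims, rpt.claims_daily, and all columns are assumed, hypothetical names.

```sql
-- Hypothetical T-SQL sketch: aggregate staged claims into a reporting table.
-- stg.claims and rpt.claims_daily are illustrative objects, not client names.
CREATE PROCEDURE rpt.load_claims_daily
    @load_date DATE
AS
BEGIN
    SET NOCOUNT ON;

    -- Idempotent reload: clear the target slice before re-inserting it.
    DELETE FROM rpt.claims_daily WHERE claim_date = @load_date;

    INSERT INTO rpt.claims_daily (claim_date, group_id, claim_count, paid_amount)
    SELECT claim_date,
           group_id,
           COUNT(*)      AS claim_count,
           SUM(paid_amt) AS paid_amount
    FROM   stg.claims
    WHERE  claim_date = @load_date
    GROUP BY claim_date, group_id;
END;
```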
Confidential - Newport Beach, CA
Data Modeler
Responsibilities:
- Understood and translated business needs into data models supporting underwriting workstation services.
- Conducted JAD sessions periodically with various stakeholders at various phases of the Software Development Life Cycle (SDLC) to discuss open issues and resolve them.
- Used a forward engineering approach to design and create databases for the OLAP model.
- Worked on AWS Data Pipeline to configure data loads from S3 into Redshift.
- Used Erwin Model Mart for effective model management, sharing, dividing, and reusing model information and designs to improve productivity.
- Worked on performance tuning of the database, including indexes and optimizing SQL statements.
- Involved in designing and deploying AWS solutions using EC2, S3, RDS, and Redshift.
- Designed and developed Oracle database tables, views, and indexes with proper privileges, and maintained and updated the database by removing old data.
- Developed ETLs to pull data from various sources and transform it for reporting applications using PL/SQL
- Used Python scripts to update the content in database and manipulate files.
- Worked on a scalable distributed data system using the Hadoop ecosystem on AWS EMR.
- Applied normalization and denormalization techniques for optimum performance in relational and dimensional database environments.
- Performed data mining on claims data using complex SQL queries and discovered claims patterns.
- Used Teradata for OLTP systems, generating models to support revenue management applications that connect to SAS.
- Extracted the data from other data sources into HDFS using Sqoop
- Used data analysis techniques to validate business rules and identify low quality missing data in the existing Amgen enterprise data warehouse (EDW).
- Created, modified, and executed DDL on AWS Redshift and Snowflake tables to load data.
- Created Tableau and Power BI reports and dashboards on OLAP and relational databases.
- Involved in Design and development of ETL processes using Informatica ETL tool for dimension and fact file creation.
- Created databases, tables, stored procedures, DDL/DML triggers, views, functions and cursors.
- Worked on data manipulation and analysis; accessed raw data in varied formats and analyzed and processed it with different methods.
- Developed AWS Lambda python functions using S3 triggers to automate workflows.
- Performed reverse engineering on the existing data model to understand the data and business flows.
- Implemented slowly changing dimension logic (Type II and Type I) for most of the dimensions (see the Type II sketch after this role).
- Created S3 buckets and managed their policies, utilizing S3 and Glacier for storage and backup on AWS.
- Worked on PL/SQL programming: stored procedures, functions, packages, and triggers.
- Developed complex stored procedures using T-SQL to generate ad-hoc reports within SQL Server Reporting Services.
- Created automated scripts for Tableau workbook refreshes when not using live connections.
Environment: Erwin 9.8, AWS, Python, Oracle, Sqoop, Teradata, OLAP, HDFS, SQL, Redshift, OLTP, PL/SQL, T-SQL, Tableau.
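
A minimal sketch of the Type II slowly changing dimension logic referenced in this role, in ANSI-flavored SQL (date functions and literals vary by engine); dim_policy, stg_policy, and the tracked column are assumptions for illustration.

```sql
-- Step 1: expire the current dimension row when a tracked attribute changed.
UPDATE dim_policy
SET    effective_end_date = CURRENT_DATE,
       is_current = 'N'
WHERE  is_current = 'Y'
  AND  EXISTS (SELECT 1
               FROM   stg_policy s
               WHERE  s.policy_id = dim_policy.policy_id
                 AND  s.policy_status <> dim_policy.policy_status);

-- Step 2: insert a new current version for changed and brand-new policies.
INSERT INTO dim_policy
       (policy_id, policy_status, effective_start_date, effective_end_date, is_current)
SELECT s.policy_id, s.policy_status, CURRENT_DATE, DATE '9999-12-31', 'Y'
FROM   stg_policy s
WHERE  NOT EXISTS (SELECT 1
                   FROM   dim_policy d
                   WHERE  d.policy_id = s.policy_id
                     AND  d.is_current = 'Y');
```

A Type I change would instead overwrite the attribute in place, with no new row.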
Confidential - San Jose, CA
Data Modeler
Responsibilities:
- Worked as a Data Modeler to generate Data Models using Power Designer and developed relational database system.
- Participated in design discussions and ensured functional specifications were delivered in all phases of the SDLC in an Agile environment.
- Analyzed database requirements in detail with the project stakeholders by conducting Joint Requirements Development sessions
- Analyzed data sources, requirements, and business rules to perform logical and physical data modeling.
- Wrote various data normalization jobs for new data ingested into Redshift.
- Designed and developed star and snowflake schemas.
- Generated comprehensive analytical reports by running SQL queries against current databases to conduct data analysis.
- Designed and built OLAP cubes for star and snowflake schemas using the native OLAP service manager.
- Developed complex MultiLoad and FastLoad scripts for loading data into Teradata tables from legacy systems.
- Involved in database development by creating Oracle PL/SQL functions, procedures, and collections.
- Designed the schema, configured and deployed AWS Redshift for optimal storage and fast retrieval of data.
- Worked with T-SQL to create Tables, Views, Triggers and Stored Procedures.
- Reviewed SQL queries and edited inner, left, and right joins in Tableau Desktop by connecting live/dynamic and static datasets.
- Created dimensional model for the reporting system by identifying required dimensions and facts using Power Designer.
- Enforced referential integrity in the OLTP data model for consistent relationships between tables and efficient database design.
- Defined the key columns for the dimension and fact tables of both the warehouse and the data mart.
- Extensively analyzed the Ralph Kimball methodology and implemented it successfully.
- Analyzed complex data sets, performing ad-hoc analysis and data manipulation in landing, staging, and warehouse schemas using SQL.
- Created complex stored procedures, user-defined functions (UDFs), and views using T-SQL to produce results for reports.
- Deployed SSRS reports to Report Manager and created linked reports, snapshots, and subscriptions for the reports and worked on scheduling of the reports.
- Designed and developed Informatica mappings for data loads and data cleansing.
- Helped migrate and convert data from Sybase to Oracle and prepared mapping documents.
- Wrote complex SQL queries to validate data against various reports generated by Business Objects (see the reconciliation sketch after this role).
- Used Microsoft Excel tools such as pivot tables, graphs, charts, and Solver to perform quantitative analysis.
Environment: Power Designer, Teradata, Oracle, SQL, PL/SQL, MS Excel, Redshift, OLAP, OLTP, SSIS, SSRS, Tableau, Informatica.
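
A hedged sketch of the kind of validation query this role describes: reconciling a warehouse table against its source by date, in generic SQL with illustrative names.

```sql
-- Flag business dates where row counts or totals disagree between
-- source and warehouse (src_transactions / wh_transactions are hypothetical).
SELECT COALESCE(s.load_date, t.load_date) AS load_date,
       s.row_count                        AS source_rows,
       t.row_count                        AS target_rows,
       s.total_amount - t.total_amount    AS amount_variance
FROM  (SELECT load_date, COUNT(*) AS row_count, SUM(amount) AS total_amount
       FROM   src_transactions
       GROUP BY load_date) s
FULL OUTER JOIN
      (SELECT load_date, COUNT(*) AS row_count, SUM(amount) AS total_amount
       FROM   wh_transactions
       GROUP BY load_date) t
  ON  s.load_date = t.load_date
WHERE s.load_date IS NULL
   OR t.load_date IS NULL
   OR s.row_count    <> t.row_count
   OR s.total_amount <> t.total_amount;
```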
Confidential - Highland Park, NJ
Data Analyst/Data Modeler
Responsibilities:
- Worked as a Data Analyst/Data Modeler involved in the entire life cycle of the project starting from requirements gathering to end of system integration.
- Gathered high level requirements and converted into business requirements.
- Worked with two sources to bring in data required for project reporting by writing SQL extracts.
- Developed the logical data models and physical data models that capture current state/future state data elements and data flows using ER/Studio.
- Conducted GAP analysis and data mapping to derive requirements for existing systems enhancements for a project.
- Worked with developers on the database design and schema optimization and other DB2 aspects.
- Developed, enhanced, and maintained snowflake and star schemas within data warehouse and data mart conceptual and logical data models.
- Used tools such as SAS/ACCESS and SAS/SQL to create and extract Oracle tables.
- Worked on extracting, transforming and loading data using SSIS Import/Export Wizard.
- Involved in extensive Data validation using SQL queries and back-end testing
- Worked on dimensional data modeling of dimensions and facts for OLAP cubes; configured the report server environment and deployed reports to the database.
- Created Data Dictionaries, Source to Target Mapping Documents and documented Transformation rules for all the fields.
- Created SQL tables with referential integrity and developed queries using SQL, and PL/SQL.
- Performed normalization of the existing OLTP systems to speed up DML statement execution (see the 3NF sketch after this role).
- Defined primary keys (PKs) and foreign keys (FKs) for the entities and created dimensional model star and snowflake schemas using the Kimball methodology.
- Wrote T-SQL statements for data retrieval and was involved in performance tuning of T-SQL queries and stored procedures.
- Used data profiling tools and techniques to ensure data quality for data requirements.
- Involved in creating cubes and tabular models using SQL Server Analysis Services (SSAS).
- Applied data naming standards, created the data dictionary, documented data model translation decisions, and maintained DW metadata.
- Worked on POCs with various teams for their ad-hoc analysis of data with Tableau.
Environment: ER/Studio, DB2, Oracle, SQL, PL/SQL, OLAP, T-SQL, Tableau, SSAS, SSIS, SSRS.
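
To illustrate the normalization work described in this role, a minimal 3NF decomposition sketch in generic SQL; orders_flat and the derived tables are hypothetical examples.

```sql
-- Before: orders_flat repeats customer attributes on every order row.
-- After: the customer block becomes its own entity, keyed by customer_id.
CREATE TABLE customer (
    customer_id   INT PRIMARY KEY,
    customer_name VARCHAR(100) NOT NULL,
    customer_city VARCHAR(50)
);

CREATE TABLE orders (
    order_id    INT PRIMARY KEY,
    order_date  DATE NOT NULL,
    customer_id INT NOT NULL REFERENCES customer (customer_id)
);

-- Populate the normalized entities from the legacy flat table.
INSERT INTO customer (customer_id, customer_name, customer_city)
SELECT DISTINCT customer_id, customer_name, customer_city
FROM   orders_flat;

INSERT INTO orders (order_id, order_date, customer_id)
SELECT order_id, order_date, customer_id
FROM   orders_flat;
```

With the repeated customer columns gone, DML against orders touches narrower rows, which is the execution-time gain the normalization targeted.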
Confidential - Iowa City, IA
Data Analyst
Responsibilities:
- Performed data analysis and data profiling using complex SQL on various sources systems and answered complex business questions by providing data to business users.
- Extensively used ETL methodology to support data extraction, transformation, and load processing in a complex EDW using Informatica.
- Created SSIS packages to extract data from OLTP to OLAP and scheduled jobs to call the packages and stored procedures.
- Participated in UAT sessions to educate the business users about the reports, dashboards and the BI System.
- Developed SQL and PL/SQL scripts for data migration between databases (see the sketch after this role).
- Involved in Analysis of functional and non-functional categorized data elements for data profiling and mapping from source to target data environment.
- Automated the process of uploading data into production tables using UNIX.
- Performed data reconciliation between integrated systems.
- Documented data dictionaries and business requirements for key workflows and process points
- Performed Backend Testing by executing SQL queries.
- Generated customized reports using SAS/MACRO facility, PROC REPORT, PROC TABULATE and PROC SQL.
- Involved with data cleansing/scrubbing and validation.
- Worked with T-SQL to create Tables, Views, Triggers and Stored Procedures.
- Effectively used the data blending feature in Tableau.
- Extensively worked with Microsoft Business Intelligence Services such as SSIS and SSRS.
- Involved in data mining, transformation and loading from the source systems to the target system.
- Generated multiple enterprise reports using SSRS and Crystal Reports.
Environment: SQL, Informatica, BI, PL/SQL, Tableau, SAS, OLTP, UNIX, T-SQL, SSIS, OLAP, SSRS.
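
A hedged PL/SQL sketch of the database-to-database migration scripts this role mentions; the schema names, the src_link database link, and etl_error_log are all assumptions for illustration.

```sql
-- Set-based copy from a legacy source into the target schema,
-- with basic error logging. All object names are hypothetical.
DECLARE
    v_err VARCHAR2(4000);
BEGIN
    INSERT INTO target_schema.members (member_id, member_name, enroll_date)
    SELECT member_id, member_name, enroll_date
    FROM   legacy_members@src_link            -- hypothetical database link
    WHERE  enroll_date >= DATE '2015-01-01';

    COMMIT;
EXCEPTION
    WHEN OTHERS THEN
        ROLLBACK;
        v_err := SQLERRM;                     -- capture before using in SQL
        INSERT INTO etl_error_log (error_time, error_msg)
        VALUES (SYSDATE, v_err);
        COMMIT;
END;
/
```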