- Over 8 years of professional experience in Data Modeling and Data Analysis for the design of OLTP and OLAP systems.
- Experienced in designing Star Schemas (identification of facts, measures and dimensions) and Snowflake Schemas for Data Warehouse and ODS architectures using tools like Erwin Data Modeler, Power Designer, Embarcadero ER/Studio and Microsoft Visio.
- Good knowledge of and experience with Normalization and De-Normalization techniques for optimal performance in relational and dimensional database environments.
- Strong experience with Conceptual, Logical and Physical Data Modeling considering metadata standards; experienced in designing canonical (relational) models, with very good knowledge of the Ralph Kimball and Bill Inmon approaches.
- Experienced in using the Erwin entity-attribute editor to build primary keys, foreign keys, alternate keys and inversion entries, and the Erwin relationship definition and subtype relationship editors to design identifying, non-identifying, recursive and subtype relationships in logical designs.
- Experienced in extracting data from mainframe flat files, Oracle and Netezza and converting it into Teradata tables using SAS PROC IMPORT, PROC SQL, etc.
- Expert in Extracting, Transforming and Loading (ETL) data using SSIS; created mappings/workflows to extract data from Oracle, SQL Server and flat-file sources and load it into various business entities.
- Extensive experience analyzing and documenting Business Requirements and system functional specifications, including use cases; facilitated and participated in Joint Application Development (JAD) and whiteboard sessions to resolve open issues and coordinate between teams.
- Expert in building Enterprise Data Warehouses and data warehouse appliances from scratch using both the Kimball and Inmon approaches.
- Extensive experience in languages like SQL, Python and PL/SQL, and in Relational and Dimensional Data Modeling for creating logical and physical database designs and ER diagrams using multiple data modeling tools like Erwin and ER Studio.
- Experienced in working with Teradata utilities like FastLoad, MultiLoad, TPump and FastExport, and with Teradata query submission and processing tools like BTEQ and Teradata SQL Assistant.
- Experienced in creating and using Stored Procedures, Triggers, Views, User-Defined Functions, sub-queries, joins and exception handlers for complex queries involving multiple tables; proficient in SQL Server and T-SQL (DDL and DML) for constructing tables and applying Normalization/De-Normalization techniques on database tables.
- Experienced in analysis, modeling, design, and development of Tableau reports and dashboards for analytics and data visualization.
- Experienced in automating and scheduling Teradata SQL scripts in UNIX using Korn shell scripting; expertise in UML (class, object, use case, state, sequence, activity and collaboration diagrams) as a business analysis methodology for application functionality design using Rational Rose and MS Visio.
- Expertise with various RDBMS platforms like Oracle, SQL Server, DB2 UDB, Teradata and Netezza, and NoSQL databases like MongoDB and Cassandra.
- Expertise in implementing Slowly Changing Dimensions (Types I, II and III) in dimension tables, as well as Conformed, Role-Playing and Degenerate Dimensions, as per requirements.
- Experienced in extracting, transforming and loading (ETL) data from spreadsheets, database tables and other sources using SQL Server Integration Services (SSIS) packages and Informatica.
- Experienced in performance tuning on Oracle databases by leveraging explain plans and tuning SQL queries; excellent experience writing complex SQL queries to validate data movement between layers in a data warehouse environment.
- Expert in building reports using SQL Server Reporting Services (SSRS), Crystal Reports, Power BI and Business Objects.
- Experienced in MDM (Master Data Management): removing duplicates, standardizing data and eliminating incorrect data.
- Excellent understanding and working experience of industry-standard methodologies like the System Development Life Cycle (SDLC) per the Rational Unified Process (RUP), Agile and Waterfall.
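The Slowly Changing Dimension work mentioned above (Type II in particular) can be sketched as follows. This is a minimal illustration only: SQLite stands in for a production warehouse, and the `dim_customer` table and its columns are hypothetical, not an actual client schema.

```python
import sqlite3

con = sqlite3.connect(":memory:")
cur = con.cursor()

# Hypothetical Type II customer dimension: each attribute change closes the
# old row (end_date set, is_current = 0) and inserts a new current row.
cur.execute("""CREATE TABLE dim_customer (
    customer_sk INTEGER PRIMARY KEY AUTOINCREMENT,
    customer_id TEXT, city TEXT,
    start_date TEXT, end_date TEXT, is_current INTEGER)""")

def scd2_upsert(customer_id, city, as_of):
    row = cur.execute(
        "SELECT customer_sk, city FROM dim_customer "
        "WHERE customer_id = ? AND is_current = 1", (customer_id,)).fetchone()
    if row and row[1] == city:
        return                      # no change, nothing to do
    if row:                         # expire the current version
        cur.execute("UPDATE dim_customer SET end_date = ?, is_current = 0 "
                    "WHERE customer_sk = ?", (as_of, row[0]))
    cur.execute("INSERT INTO dim_customer "
                "(customer_id, city, start_date, end_date, is_current) "
                "VALUES (?, ?, ?, '9999-12-31', 1)", (customer_id, city, as_of))

scd2_upsert("C001", "Newark", "2020-01-01")
scd2_upsert("C001", "Piscataway", "2020-06-01")   # change -> new version
history = cur.execute("SELECT city, is_current FROM dim_customer "
                      "ORDER BY customer_sk").fetchall()
print(history)   # [('Newark', 0), ('Piscataway', 1)]
```

Type I would simply overwrite `city` in place, and Type III would keep a single `previous_city` column; Type II is shown because it preserves full history.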
Data Modeling Tools: Erwin r9.6/r9.5/r9.1, ER Studio, Toad Data Modeler, Power Designer.
ETL Tools: Microsoft SSIS and Informatica.
Programming Languages: SQL, T-SQL, XML, PL/SQL and Python.
Database Tools: Microsoft SQL Server, Teradata, Netezza, Oracle 12c/11g/10g, MS Access, Cassandra, DynamoDB.
Reporting and Visualization Tools: Power BI, Tableau and SSRS.
Confidential, Piscataway, NJ
Sr. Data Analyst
- Utilizing advanced statistical methodologies and machine learning models such as linear, multiple and logistic regression, decision trees and Bayesian inference to solve complex business problems.
- Performed data profiling and preliminary data analysis; handled anomalies such as missing values, duplicates and outliers, and imputed or removed irrelevant data.
- Working extensively with advanced analytics utilizing various filters, sets, groups, LOD expressions, parameters, hierarchies, calculated fields and table calculations in Tableau reports and workbooks.
- Strong hands-on experience creating dashboards, data visualizations and analytics using Tableau Desktop.
- Handling large sets of data using Python and Tableau.
- Analyzing large data sets, conducting regression analysis and working with Gaussian distributions.
- Methodically writing complex queries for joining, filtering and analyzing data using SQL Server 2014.
- Extensive experience in creating, maintaining and tuning views, user-defined functions and system functions using SQL Server.
- Experience working with live, structured and unstructured data from multiple data sources and translating complex data into user-friendly formats.
- Integrated data from various data sources like Oracle, MS SQL Server, IBM DB2 and Teradata using Informatica to perform Extraction, Transformation and Loading (ETL) processes.
- Worked with and extracted data from various database sources like Oracle and SQL Server.
- Designed and built Star and Snowflake dimensional models, creating facts, dimensions, measures and cubes and establishing data granularity, and loaded flat-file data into them.
- Identifying and evaluating Key Performance Indicators (KPIs).
- Establishing and coordinating the assembly of records necessary for quality-control tracking, including charts, statistical analyses and reports.
- Managing project activities, resources and forecasts through Clarity allocations using Gantt charts.
- Developing AWS CloudFormation templates to create custom-sized VPCs, subnets, EC2 instances, ELBs and security groups.
- Creating high-availability, scalable AWS stacks using EC2 Auto Scaling.
- Maintaining a farm of EC2 instances, ELBs and RDS instances.
- Worked on cloud automation using AWS CloudFormation templates.
- Responsible for creating new IAM users and groups, in addition to defining roles and policies in the directory server.
- Responsible for hosting static content on AWS S3 and keeping it updated on the content delivery network.
- Coordinated with Business Analyst and designed logical and physical data models as per the requirements.
Environment: Tableau Desktop 10.x/9.x, SQL Server 2014, Python 3.x, R, SSAS, AWS, ETL, Oracle, Informatica, Teradata, MS Excel, Origin, Agile Scrum.
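The data-profiling step described in this role (counting missing values and duplicates, flagging outliers) can be sketched with plain Python. The sample data and the 1.5×IQR outlier fence are illustrative assumptions, not the production logic:

```python
from collections import Counter
import statistics

# Hypothetical sample of a numeric column, with gaps and one spike.
rows = [12.0, 14.5, None, 13.2, 14.5, 250.0, 12.8, None, 13.9, 14.1]

missing = sum(1 for v in rows if v is None)
values = [v for v in rows if v is not None]
# Count extra occurrences beyond the first as duplicates.
duplicates = sum(c - 1 for c in Counter(values).values() if c > 1)

# Simple IQR fence for outliers: anything beyond 1.5 * IQR past the quartiles.
q1, _q2, q3 = statistics.quantiles(values, n=4)
iqr = q3 - q1
outliers = [v for v in values if v < q1 - 1.5 * iqr or v > q3 + 1.5 * iqr]

print(missing, duplicates, outliers)   # 2 1 [250.0]
```

In practice the flagged rows would then be imputed (e.g. with a median) or excluded, matching the "imputed or removed irrelevant data" step above.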
Confidential, Chicago IL
Sr. Data Analyst
- Involved in sessions with business, project manager, Business Analyst and other key people to understand the Business Needs and propose a solution from a Warehouse standpoint.
- Experienced in designing the architecture of the organization at the enterprise level, with a thorough understanding and hands-on experience of the Software Development Life Cycle from ideation to implementation.
- Responsible for the design, development and administration of analytical data processes such as converting Teradata SQL scripts to Redshift psql scripts, data validation checks and metadata review; also responsible for data access and delivery technologies.
- Designed ER diagrams, the logical model (relationships, cardinality, attributes and candidate keys) and the physical database (capacity planning, object creation and aggregation strategies) for Oracle and Teradata per business requirements using ER Studio.
- Worked on a Data Lake in AWS S3, copying data to Redshift and writing custom SQL to implement business logic, using UNIX and Python script orchestration for analytics solutions.
- Implemented metadata standards, data governance and stewardship, master data management, ETL, ODS, data warehouse, data marts, reporting, dashboard, analytics, segmentation, and predictive modeling.
- Conducted live JAD sessions to elicit, analyze and document business requirements from the business; held sessions with SMEs and all stakeholders to develop high-level requirements, then formalized those into detail-level requirements.
- Worked with SMEs and other stakeholders to determine requirements, identifying entities and attributes to build Conceptual, Logical and Physical Data Models; designed an industry-standard data model specific to the company's group insurance offerings and translated business requirements to detailed production level using Workflow, Sequence, Activity and Use Case diagrams.
- Responsible for analyzing report requirements and developing reports by writing Teradata SQL queries and using MS Excel, PowerPoint and UNIX.
- Designed a Database Maintenance Plan for SQL Server performance, covering database integrity checks, updating database statistics and re-indexing.
- Created a high-level, industry-standard, generalized data model to be converted into Logical and Physical Models in later project stages using ER-Studio; involved in creating UML diagrams including Context, Business Rules Flow and Class diagrams.
- Developed Stored Procedures, Functions and Packages to implement logic at the server end on Oracle; performed application/SQL tuning using Explain Plan, SQL Tracing and TKPROF, and used Materialized Views for reporting requirements.
- Developed numerous Teradata SQL queries, creating SET and MULTISET tables, views and volatile tables, using inner and outer joins, date and string functions, and advanced techniques like the RANK and ROW_NUMBER functions.
- Extracted, transformed and loaded data sources to generate CSV data files using Python programming and SQL queries.
- Involved in extensive data validation using SQL queries and back-end testing; generated DDL statements for the creation of new ER/Studio objects like tables, views, indexes, packages and stored procedures.
- Conducted design walkthrough sessions with the Business Intelligence team to ensure reporting requirements were met; developed data mapping, data governance, transformation and cleansing rules for the Master Data Management architecture involving OLTP and ODS.
- Collaborated with ETL, BI and DBA teams while working on SQL Server and Teradata to analyze and provide solutions to data issues and other challenges while implementing the OLAP model.
- Performed dimensional modeling on the OLAP system using Ralph Kimball methodologies; extracted data from Oracle 11g and uploaded it to Teradata tables using the Teradata utilities FastLoad and MultiLoad.
- Generated ad-hoc reports using Crystal Reports 9; prepared analytical and status reports and updated the project plan as required; worked closely with ETL SSIS developers to explain complex data transformation logic; created and deployed reports using SSRS.
- Utilized Power Query in Power BI to Pivot and Un-pivot the data model for data cleansing and data massaging.
Environment: ER-Studio, Python, Teradata, Netezza, Oracle 12c, Microsoft Office SharePoint, AWS S3, Redshift, Cognos, MS Office (Word, Excel and PowerPoint), MS Project, MS FrontPage, MS Access, EDI, UML, MS Visio, Oracle Designer, SQL Server 2012, Oracle SQL Developer 2012, Crystal Reports, SSRS, SSIS and Tableau.
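The RANK and ROW_NUMBER techniques used in the Teradata queries above can be sketched in standard SQL. SQLite (via Python's `sqlite3`) stands in for Teradata here, and the `sales` table is a hypothetical example:

```python
import sqlite3

con = sqlite3.connect(":memory:")
cur = con.cursor()
cur.execute("CREATE TABLE sales (region TEXT, rep TEXT, amount INTEGER)")
cur.executemany("INSERT INTO sales VALUES (?, ?, ?)", [
    ("East", "A", 500), ("East", "B", 700), ("East", "C", 700),
    ("West", "D", 300), ("West", "E", 900),
])

# RANK gives tied rows the same rank and leaves gaps after ties;
# ROW_NUMBER is always unique within each partition.
top = cur.execute("""
    SELECT region, rep, amount,
           RANK()       OVER (PARTITION BY region ORDER BY amount DESC) AS rnk,
           ROW_NUMBER() OVER (PARTITION BY region ORDER BY amount DESC) AS rn
    FROM sales""").fetchall()
for row in top:
    print(row)
```

With the tie in the East region, reps B and C both get rank 1 while A gets rank 3 (the gap), which is the distinction that matters when picking "top N per group" in a report.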
Confidential - Kennett Square, PA.
- Engaged with business users in gathering related business requirements and documented system performance by reviewing the existing Data Model.
- Mapped the Business Requirements and New Databases to the Logical Data Model, which defines the project delivery needs.
- Generated DDL statements and scripts from the Physical Data Model to create objects like tables, views, indexes, stored procedures and packages.
- Designed fact, dimension and aggregate tables for the Data Warehouse and worked on converting data stored in flat files into Oracle tables.
- Provided technical guidance for re-engineering Teradata warehouse operations into Netezza and used various Teradata index techniques to improve query performance.
- Implemented naming Standards and Warehouse Metadata for fact and dimension of Logical and Physical Data Model.
- Developed Star and Snowflake Schemas when translating the Logical Model into the Dimensional Model.
- Translated the Logical Data Model into Physical Data Model with Forward Engineering in Erwin tool.
- Used Reverse Engineering and Forward Engineering techniques on Databases from DDL scripts and created tables and Models in Data Mart, Data Warehouse and Staging.
- Worked on slowly changing dimension tables and hierarchies in dimensions and worked on Data Mapping process from source system to target system.
- Reverse Engineered DB2 databases and forward engineered them to SQL Server 2000 using ER-Studio.
- De-normalized the database into the Star Schema of the Data Warehouse; created documentation and test cases, and worked with users on new module enhancements and testing.
- Understood the existing Data Model and documented suspected design issues affecting system performance.
- End-to-end grasp of the Software Development Life Cycle (SDLC), including analysis, design, development and testing of software applications.
Environment: ER-Studio, Oracle 10g, SQL Server, UML, Crystal Reports, Windows 7, MS Office, Teradata, Netezza, SQL, PL/SQL, T-SQL, Informatica, SSRS, SAS, SSIS, XML, Excel, Pivot Tables, Metadata, Cognos.
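The star-schema design work above (fact and dimension tables, forward-engineered DDL) can be sketched as follows. The retail-style schema is a hypothetical example in SQLite, not the actual client model:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("PRAGMA foreign_keys = ON")

# Dimension tables carry descriptive attributes; the fact table carries
# measures plus a foreign key to each dimension (the "points" of the star).
ddl = """
CREATE TABLE dim_date    (date_sk INTEGER PRIMARY KEY, full_date TEXT, year INTEGER);
CREATE TABLE dim_product (product_sk INTEGER PRIMARY KEY, name TEXT, category TEXT);
CREATE TABLE fact_sales (
    sale_id    INTEGER PRIMARY KEY,
    date_sk    INTEGER NOT NULL REFERENCES dim_date(date_sk),
    product_sk INTEGER NOT NULL REFERENCES dim_product(product_sk),
    quantity   INTEGER,
    amount     REAL
);
"""
con.executescript(ddl)

con.execute("INSERT INTO dim_date VALUES (20200101, '2020-01-01', 2020)")
con.execute("INSERT INTO dim_product VALUES (1, 'Widget', 'Hardware')")
con.execute("INSERT INTO fact_sales VALUES (1, 20200101, 1, 3, 29.97)")

# A typical star join: aggregate fact measures grouped by dimension attributes.
total = con.execute("""
    SELECT d.year, p.category, SUM(f.amount)
    FROM fact_sales f
    JOIN dim_date d    ON d.date_sk = f.date_sk
    JOIN dim_product p ON p.product_sk = f.product_sk
    GROUP BY d.year, p.category""").fetchall()
print(total)
```

A snowflake variant would further normalize `dim_product` (e.g. split `category` into its own table); the de-normalized star form shown here is what the warehouse bullets above describe.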
Confidential, Grand Rapids, MI
SQL BI Developer
- Played a major role in data migration: migrated data from the existing system to a new system by creating SSIS packages.
- Responsible for developing, monitoring and deploying SSIS packages and extracting data with SSIS from OLTP to OLAP systems.
- Utilized Power Query in Power BI to pivot and un-pivot the data model for data cleansing.
- Involved in Power BI Administration practices like scheduling the refresh, creating workspaces for each environment, creating Apps for each workspace, granting permissions to the reports in the organization.
- Actively interacted with DBAs and the technical manager to understand business requirements and the functional workflow of information, and to meet their needs.
- Worked with Dynamics CRM; extracted and manipulated data in CRM using SSIS packages.
- Created Visual Charts, Graphs, Maps, Area Maps, Dashboards and Storytelling using Tableau and Power BI.
- Developed and published daily trend analysis on retail metrics especially relating to sales, revenue, customer retention and business development team’s productivity.
- Developed Power BI reports using query logic based on business rules, with visuals selected by the stakeholders.
- Designed C# forms and web pages to display SSRS and Crystal Reports for business reporting purposes.
- Extracted data from Oracle database and SQL server database for designing SSIS packages in Visual Studio.
- Designed, developed, and deployed reports in MS SQL Server environment using SSRS.
- Extracted data from the vendor's system to SQL Server via web service calls and extracted data from the web portal for analysis.
- Configured several security policies on various entities in the system to manage and implement appropriate access controls in MS Dynamics CRM.
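The pivot/un-pivot reshaping done in Power Query (mentioned in this role and the earlier one) can be sketched in plain Python. The store/month columns are illustrative, and the functions are a minimal stand-in for Power Query's "Pivot Column" / "Unpivot Columns" steps:

```python
# Un-pivot (melt): one wide row per store becomes one narrow row per
# store/month pair -- the shape "Unpivot Columns" produces for cleansing.
wide = [
    {"store": "North", "Jan": 100, "Feb": 120},
    {"store": "South", "Jan": 90,  "Feb": 95},
]

def unpivot(rows, id_col, value_cols):
    return [{id_col: r[id_col], "month": c, "sales": r[c]}
            for r in rows for c in value_cols]

def pivot(rows, id_col, key_col, val_col):
    # Rebuild one wide row per id, spreading key/value pairs into columns.
    out = {}
    for r in rows:
        out.setdefault(r[id_col], {id_col: r[id_col]})[r[key_col]] = r[val_col]
    return list(out.values())

narrow = unpivot(wide, "store", ["Jan", "Feb"])
print(narrow[0])   # {'store': 'North', 'month': 'Jan', 'sales': 100}
assert pivot(narrow, "store", "month", "sales") == wide   # round-trip
```

The narrow form is what cleansing and standardization rules operate on row by row; pivoting back restores the report-friendly wide layout.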