Sr. Data Modeler/Data Analyst Resume
South Portland, ME
SUMMARY:
- Over 9 years of professional experience in Data Modeling and Data Analysis; proficient in gathering business requirements and handling requirements management.
- Skilled in system analysis, ER and dimensional modeling, database design, and implementing RDBMS-specific features.
- Good experience with normalization (1NF, 2NF and 3NF) and de-normalization techniques for improved database performance in OLTP, OLAP, Data Warehouse and Data Mart environments.
- Solid hands-on experience administering data model repositories and maintaining documentation in metadata portals using tools such as Erwin, ER/Studio and PowerDesigner.
- Hands-on experience with both the Kimball and Inmon dimensional modeling methodologies.
- Strong working knowledge of Microsoft Office, including MS Word, MS Excel and MS PowerPoint.
- 3+ years of Tableau experience (Tableau Desktop and Tableau Server) across the Analysis, Design, Development and Maintenance phases, providing end-to-end solutions on Data Warehousing projects.
- Involved in full life cycle development of reporting projects, including requirements gathering/analysis, design, development, testing and production rollout.
- Involved in troubleshooting and performance tuning of reports and resolving issues within Tableau Server and reports.
- Experienced in designing customized interactive dashboards in Tableau using Marks, Actions, Filters, Parameters, Calculations and Relationships.
- Able to gather business requirements and translate them into reporting needs.
- Recommended dashboards per Tableau visualization features and delivered reports to the business team in a timely manner.
- Successfully upgraded Tableau platforms in a clustered environment and performed content upgrades.
- Expertise in creating conceptual, logical and physical data models for OLTP and OLAP systems using dimensional models and star schema for data warehouse and operational systems.
- Strong working Experience with Agile, Scrum, Kanban and Waterfall methodologies.
- Experience in Informatica Power Center Administration and Informatica server installation, repository configuration and debugging.
- Strong experience in Data Migration, Data Cleansing, Transformation, Integration, Data Import, and Data Export.
- Extensively worked with data warehousing, Business Intelligence and ETL methodologies and technologies using Informatica.
- Expertise in performing User Acceptance Testing (UAT) and conducting end user sessions.
- Proficient in data governance, data quality, data stewardship, metadata management, master data management.
- Strong background in Database development and designing of data models for different domains.
- Experience in data analysis to track data quality and to detect and correct inaccurate data in the databases.
- Extensive experience working with XML, Schema Designing and XML data.
- Responsible for data governance rules and standards to maintain the consistency of the business element names in the different data layers.
- Conducted data analysis, data mapping and transformation, and data modeling, applying data warehouse concepts.
- Excellent knowledge in creating Databases, Tables, Stored Procedures, DDL/DML Triggers, Views, functions and Indexes using T-SQL.
- Strong experience in using Excel and MS Access to dump the data and analyze based on business needs.
- Performed platform capacity planning and management.
- Provided all levels of technical and application support, including incident management (Levels 1, 2 and 3), and addressed user, application and data issues.
- Executed change management processes surrounding new releases of SAS functionality.
- As a Senior Risk Quant SAS Developer, designed and developed robust automated solutions in partnership with stakeholders in Risk and Finance.
- Performed process analysis and process improvement; created and maintained technical documentation.
- Analyzed, designed and tested break/fix requests and enhancements to SAS solutions (SAS Visual Analytics, SAS Financial Management, SAS EGRC, SAS CRMB, SAS RMFB).
- Supported SAS developers with coding issues.
- Good understanding of Access queries, VLOOKUP, formulas, Pivot Tables, etc.
- Excellent knowledge in Data Analysis, Data Validation, Data Cleansing, Data Verification and identifying data mismatch.
- Excellent experience in writing SQL queries to validate data movement between different layers in data warehouse environment.
- Extensive knowledge and experience in producing tables, reports, graphs and listings using various procedures and handling large databases to perform complex data manipulations.
- Experience in testing Business Intelligence reports generated by various BI Tools like Cognos and Business Objects.
- Excellent at creating project artifacts, including specification documents, data mapping and data analysis documents.
- Experience in conducting Joint Application Development (JAD) sessions with SMEs, Stakeholders and other project team members for requirements gathering and analysis.
- Excellent knowledge of creating reports in SAP Business Objects, including web reports for multiple data providers.
- Excellent experience in writing and executing unit, system, integration and UAT scripts in data warehouse projects.
- Experience in supporting end-user reporting needs using Informatica.
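As an illustration of the layer-to-layer data movement validation described above, the following is a minimal sketch using Python's built-in sqlite3 module; the staging/target table names and columns are hypothetical stand-ins for real warehouse layers.

```python
import sqlite3

# Hypothetical staging and target tables standing in for two warehouse layers.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE stg_orders (order_id INTEGER, amount REAL)")
cur.execute("CREATE TABLE dw_orders  (order_id INTEGER, amount REAL)")
cur.executemany("INSERT INTO stg_orders VALUES (?, ?)",
                [(1, 10.0), (2, 25.5), (3, 7.25)])
cur.executemany("INSERT INTO dw_orders VALUES (?, ?)",
                [(1, 10.0), (2, 25.5), (3, 7.25)])

# Reconciliation query: row counts and amount totals must match between layers.
cur.execute("""
    SELECT (SELECT COUNT(*) FROM stg_orders) - (SELECT COUNT(*) FROM dw_orders),
           (SELECT SUM(amount) FROM stg_orders) - (SELECT SUM(amount) FROM dw_orders)
""")
count_diff, amount_diff = cur.fetchone()

# Anti-join: staging rows that never arrived in the target layer.
cur.execute("""
    SELECT s.order_id FROM stg_orders s
    LEFT JOIN dw_orders d ON s.order_id = d.order_id
    WHERE d.order_id IS NULL
""")
missing = cur.fetchall()
```

A zero difference on both checks and an empty anti-join result indicate the load moved all rows intact.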
TECHNICAL SKILLS:
Major Skills: Erwin, Agile, NoSQL, Oracle, AWS, Redshift, Hive, PL/SQL, XML, SQL Server, OLAP, OLTP, T-SQL, SSIS, Hbase, Cassandra, MS Access, SQL queries, SQL Joins, MySQL, Microsoft Excel, SQL scripts.
Modeling Tools: Erwin r9.7, Sybase Power Designer 16.6, Oracle Designer, ER/Studio.
Database Tools: Microsoft SQL Server 2017, Teradata 15, Oracle 12c, MS Access 2016, PostgreSQL, Netezza.
OLAP Tools: Tableau, SAP BO, SSAS, Business Objects, and Crystal Reports.
ETL Tools: SSIS, Pentaho, Informatica Power Center, SAP Business Objects XIR3.1/XIR2, Web Intelligence, SSRS.
Operating Systems: Windows 10/8, DOS, Unix.
Reporting Tools: Business Objects, SAP Crystal Reports, SAP Business Intelligence, MicroStrategy.
Web technologies: HTML 5, DHTML, XML.
Tools & Software: Toad, MS Office 2016, BTEQ, Teradata SQL Assistant.
Big Data Technologies: Hadoop 3.0, HDFS, Hive 2.3, Pig 0.17, HBase, Sqoop 1.8, Flume.
Cloud (AWS): Amazon Redshift, EC2, S3, SQS.
Other tools: SQL*Plus, SQL*Loader, MS Project, MS Visio 2016 and MS Office 2016.
WORK EXPERIENCE:
Confidential, South Portland, ME
Sr. Data Modeler/Data Analyst
Responsibilities:
- Worked as a Sr. Data Modeler/Analyst to generate data models using Erwin and developed relational database systems.
- Worked on Software Development Life Cycle (SDLC) with good working knowledge of testing, agile methodology, disciplines, tasks, resources and scheduling.
- Worked in Agile Environment using tools like JIRA and Version One.
- Conducted JAD sessions with management, vendors, users and other stakeholders for open and pending issues to develop specifications.
- Developed the long-term data warehouse roadmap and architecture, and designed and built the data warehouse framework per the roadmap.
- Created tables, views, indexes, Partitions and generated SQL scripts using Erwin.
- Designed the data marts using the Ralph Kimball's Dimensional Data Mart modeling methodology using Erwin.
- Created Informatica mappings with PL/SQL Procedures/Functions to build business rules to load data.
- Developed PL/SQL procedures, functions, cursors, packages, views and materialized views.
- Used SQL*Loader, external tables and import/export utilities to load data into Oracle.
- Developed complex mapping to extract data from diverse sources including flat files, RDBMS tables, legacy system files, XML files, and Applications.
- Performed data analysis, data migration and data profiling using complex SQL on various source systems, including SQL Server.
- Developed Informatica mappings, enabling the extract, transport and loading of the data into target tables.
- Developed logical data models and physical database design and generated database schemas using Erwin.
- Designed and developed architecture for data services ecosystem spanning Relational, NoSQL, and Big Data technologies.
- Worked with DBA group to create Best-Fit Physical Data Model from the Logical Data Model using Forward Engineering.
- Reverse Engineered NoSQL databases and then forward engineered them to Oracle using Erwin.
- Created data models for AWS Redshift and Hive from dimensional data models.
- Worked with Business Analysts, the QA team and DBAs on requirements gathering, business analysis, testing and project coordination.
- Worked on ETL design and development, creation of the Informatica source to target mappings, sessions and workflows to implement the Business Logic.
- Involved in data analysis and modeling for the OLAP and OLTP environment.
- Wrote T-SQL statements for retrieval of data and Involved in performance tuning of T-SQL queries and Stored Procedures.
- Used SQL Server Reporting Services (SSRS) for database reporting in Oracle.
- Created SSIS packages to export data from text file to SQL Server Database.
- Generated XML files from Erwin to be loaded into the metadata repository (MDR).
- Conducted design walk through sessions with Business Intelligence team to ensure that reporting requirements are met for the business.
- Wrote SQL scripts for creating tables, Sequences, Triggers, views and materialized views.
- Designed Data Flow Diagrams, E/R Diagrams and enforced all referential integrity constraints.
- Developed and maintained data models, data dictionaries, data maps and other artifacts across the organization, including conceptual and physical models and the metadata repository.
- Performed extensive Data Validation, Data Verification against Data Warehouse and performed debugging of the SQL-Statements and stored procedures.
- Designed and implemented basic SQL queries for testing and report/data validation.
- Ensured the compliance of the extracts to the Data Quality Center initiatives.
- Used SQL, PL/SQL to validate the Data going in to the Data warehouse.
- Designed reports in Access, Excel using advanced functions not limited to pivot tables, formulas.
- Gathered and documented the Audit trail and traceability of extracted information for data quality.
Environment: Erwin 9.7, Agile, NoSQL, Oracle 12c, AWS, Redshift, Hive 2.3, PL/SQL, XML, SQL Server, OLAP, OLTP, T-SQL, SSIS, HBase, Cassandra 3.11, MS Access 2016, MS Excel 2016
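The table, view and trigger scripting mentioned in this role can be sketched as follows; this is an illustrative SQLite example via Python's sqlite3 module (standing in for Oracle/SQL Server DDL), and all object names are hypothetical.

```python
import sqlite3

# Hypothetical table, view, and audit trigger, in the spirit of the
# DDL scripts described above.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customer (
        customer_id INTEGER PRIMARY KEY,
        name        TEXT NOT NULL,
        updated_at  TEXT
    );
    CREATE VIEW v_customer AS
        SELECT customer_id, name FROM customer;
    CREATE TRIGGER trg_customer_upd
        AFTER UPDATE OF name ON customer
        FOR EACH ROW
        BEGIN
            UPDATE customer SET updated_at = datetime('now')
            WHERE customer_id = NEW.customer_id;
        END;
""")
conn.execute("INSERT INTO customer (customer_id, name) VALUES (1, 'Acme')")
conn.execute("UPDATE customer SET name = 'Acme Corp' WHERE customer_id = 1")
row = conn.execute("SELECT name FROM v_customer WHERE customer_id = 1").fetchone()
stamp = conn.execute("SELECT updated_at FROM customer").fetchone()[0]
```

The trigger stamps each name change automatically, so the audit column needs no application code.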
Confidential, Tarrytown, NY
Data Modeler/Data Analyst
Responsibilities:
- As a Sr. Data Modeler / Data Analyst I was responsible for all data related aspects of a project.
- Extensively used Agile methodology as the Organization Standard to implement the data models.
- Translated business and data requirements into all data related aspects of a project.
- Ensured data design follows the prescribed architecture and framework.
- Developed PL/SQL procedures and packages to load data into Oracle, and tuned Informatica mappings and sessions for optimum performance.
- Conducted data modeling JAD sessions and communicated data related standards.
- Developed Conceptual, Logical and Physical data models for central model consolidation.
- Involved in the implementation of Data Governance Standards including Data Lineage, Data Provisioning, and Data Modeling.
- Extensively used Informatica ETL to load data from MS SQL Server, Excel spreadsheets and flat files into the target Oracle database.
- Worked on Master data Management (MDM) Hub and interacted with multiple stakeholders.
- Deployed SSRS reports to Report Manager and created linked reports, snapshots, and subscriptions for the reports and worked on scheduling of the reports.
- Developed reports using complex formulas and queried the database to generate different types of ad-hoc reports using SSRS.
- Involved in unit and system testing of OLAP report functionality and validation of the data displayed in the reports.
- Resolved data type inconsistencies between the Source systems and Data Warehouse.
- Used Erwin for reverse engineering to connect to existing database and ODS to create graphical representation in the form of Entity Relationships and elicit more information.
- Configured Hadoop Ecosystems to read data transaction from HDFS and Hive.
- Performed Data Profiling to identify data issues upfront, provided SQL prototypes to confirm the business logic provided prior to the development.
- Implemented Forward engineering to create tables, views and SQL scripts and mapping documents.
- Used reverse engineering to connect to existing database and create graphical representation (E-R diagram).
- Assisted data warehouse project team in extracting business rules.
- Created an enterprise data dictionary and maintained standards documentation.
- Analyzed data requirements and translated them into dimensional data models.
- Developed and deployed Data Quality T-SQL codes, stored procedures, views, functions, triggers and jobs.
- Analyzed source data, extracted, transformed, and loaded data in to target data warehouse based on the requirement specification using Informatica Power center.
- Developed SQL Stored procedures to query dimension and fact tables in data warehouse.
- Proficient in SQL across a number of dialects, including PostgreSQL, Redshift, SQL Server and Oracle.
- Developed and implemented data cleansing, data security, data profiling and data monitoring processes.
- Gathered and analyzed existing physical data models for in scope applications and proposed the changes to the data models according to the requirements.
- Designed and documented Use Cases, Activity Diagrams, Sequence Diagrams, OOD (Object Oriented Design) using UML and Visio.
- Involved in testing the XML files and checked whether data is parsed and loaded to staging tables.
- Performed data cleansing and data manipulation activities using NoSQL utilities.
- Tested the reports developed in Business Objects.
Environment: Agile, Erwin 9.6, MDM, SSRS, OLAP, Hive 2.3, HDFS, Hadoop 3.0, SQL, T-SQL, MS Visio 2016, NOSQL, XML
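The data profiling described in this role (surfacing data issues with SQL prototypes before development) can be sketched as below; the source table and its columns are hypothetical, and SQLite via Python's sqlite3 module stands in for the actual source system.

```python
import sqlite3

# Hypothetical source table with a null email and a null state,
# the kind of issues profiling is meant to surface upfront.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE src_customer (id INTEGER, email TEXT, state TEXT)")
conn.executemany("INSERT INTO src_customer VALUES (?, ?, ?)", [
    (1, "a@x.com", "NY"),
    (2, None,      "NY"),
    (3, "c@x.com", None),
])

# Per-column profile: null count and distinct (non-null) count.
profile = {}
for col in ("id", "email", "state"):
    nulls, distinct = conn.execute(
        f"SELECT SUM({col} IS NULL), COUNT(DISTINCT {col}) FROM src_customer"
    ).fetchone()
    profile[col] = {"nulls": nulls, "distinct": distinct}
```

A profile like this quickly flags columns with unexpected nulls or suspiciously low cardinality before mapping work begins.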
Confidential, Charlotte, NC
Data Analyst/Data Modeler
Responsibilities:
- Worked as a Data Analyst/Data Modeler to generate Data Models using Erwin and subsequent deployment to Enterprise Data Warehouse.
- Analyzed the business requirements of the project by studying the Business Requirement Specification document.
- Extensively worked on Data Modeling tools Erwin Data Modeler to design the data models.
- Worked on Amazon Redshift and AWS and architecting a solution to load data, create data models.
- Produced 3NF data models for OLTP designs using data modeling best practices and modeling skills.
- Applied normalization and de-normalization techniques for optimum performance in relational and dimensional database environments.
- Optimized and updated UML Models (Visio) and Relational Data Models for various applications.
- Responsible for technical Data governance, enterprise wide Data modeling and Database design.
- Designed tables and implemented the naming conventions for Logical and Physical Data Models in Erwin.
- Coordinated with DBA's and generated SQL code from the data models and generated DDL scripts using forward engineering in Erwin.
- Gathered requirements and created ER (entity-relationship) diagrams as part of requirements analysis using Erwin.
- Developed data models and data migration strategies utilizing sound concepts of data modeling including star schema, snowflake schema.
- Developed the data dictionary for various projects for the standard data definitions related data analytics.
- Created database objects like Procedures, Functions, Packages, Triggers, Indexes and Views using T-SQL.
- Established a business analysis methodology around the Rational Unified Process (RUP); developed use cases and project plans and managed scope.
- Worked on analyzing source systems and their connectivity, discovery, data profiling and data mapping.
- Developed and implemented data cleansing, data security, data profiling and data monitoring processes.
- Used SQL Server and T-SQL to construct tables, applying normalization and de-normalization techniques to database tables.
- Worked on the reporting requirements and involved in generating the reports for the Data Model.
Environment: Erwin 9.2, AWS, MS Visio 2014, SQL, T-SQL
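The star-schema modeling referenced in this role follows the usual fact-plus-dimensions pattern; here is a toy sketch with hypothetical table names, using SQLite via Python's sqlite3 module.

```python
import sqlite3

# Toy star schema: one fact table with foreign keys into two dimensions.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_date    (date_key INTEGER PRIMARY KEY, year INTEGER);
    CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE fact_sales (
        date_key    INTEGER REFERENCES dim_date(date_key),
        product_key INTEGER REFERENCES dim_product(product_key),
        amount      REAL
    );
""")
conn.execute("INSERT INTO dim_date VALUES (20240101, 2024)")
conn.execute("INSERT INTO dim_product VALUES (1, 'Widget')")
conn.executemany("INSERT INTO fact_sales VALUES (?, ?, ?)",
                 [(20240101, 1, 100.0), (20240101, 1, 50.0)])

# Typical star-join rollup: total sales by year and product.
total = conn.execute("""
    SELECT d.year, p.name, SUM(f.amount)
    FROM fact_sales f
    JOIN dim_date d    ON f.date_key = d.date_key
    JOIN dim_product p ON f.product_key = p.product_key
    GROUP BY d.year, p.name
""").fetchone()
```

Keeping measures in the fact table and descriptive attributes in the dimensions is what makes rollups like this a simple join-and-group.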
Confidential, Chicago, IL
Sr. Data Analyst/Data Modeler
Responsibilities:
- Worked as a Data Analyst/Data Modeler to understand business logic and user requirements.
- Closely worked with cross functional Data warehouse members to import data into SQL Server and connected to SQL Server to prepare spreadsheets.
- Created dimensional data model based on star schema using Kimball's methodology.
- Created reports for the Data Analysis using SQL Server Reporting Services.
- Performed Data Manipulation using MS Excel Pivot Sheets and produced various charts for creating the mock reports.
- Created Use Case Diagrams, Activity Diagrams, Sequence Diagrams in Rational Rose.
- Created VLOOKUP functions in MS Excel for searching data in large spreadsheets.
- Created logical Data model from the conceptual model and its conversion into the physical database design using ER/Studio.
- Designed and implemented basic PL/SQL queries for testing and report/data validation.
- Designed Star and Snowflake Data Models for Enterprise Data Warehouse using ER/Studio.
- Performed data analysis and data profiling using SQL Data Explorer on various source systems.
- Identified source databases and created the dimensional tables and checked for data quality using complex SQL queries.
- Responsible for data lineage, maintaining data dictionary, naming standards and data quality.
- Created and modified several database objects such as Tables, Views, Indexes, Constraints, Stored procedures, Packages, Functions and Triggers using SQL and PL/SQL.
- Wrote Python scripts to parse XML documents and load the data in database.
- Developed optimized stored procedures, T-SQL queries, User Defined Functions (UDF), Cursors, Views and Triggers, SQL Joins and other statements for reporting.
- Designed the data marts in dimensional data modeling using star and snowflake schemas.
- Performed Data Analysis for building the reports and building Enterprise Data warehouse (EDW).
- Performed Data Analysis and Data Profiling and worked on data transformations and data quality rules.
- Involved in extensive data validation by writing several complex SQL queries and Involved in back-end testing and worked with data quality issues.
- Involved in Data profiling in order to detect and correct inaccurate data and maintain the data quality.
- Collected, analyzed and interpreted complex data for reporting and performance trend analysis.
Environment: XML, ER/Studio v14, MS Excel 2012, Python, T-SQL, Rational Rose, PL/SQL
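The Python scripts for parsing XML and loading it into a database, mentioned in this role, can be sketched as follows; the XML layout and table name are hypothetical, with SQLite standing in for the target database.

```python
import sqlite3
import xml.etree.ElementTree as ET

# Hypothetical XML document to be parsed and loaded.
xml_doc = """
<orders>
    <order id="1"><amount>10.50</amount></order>
    <order id="2"><amount>20.00</amount></order>
</orders>
"""
root = ET.fromstring(xml_doc)

# Load parsed rows into a staging table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (order_id INTEGER, amount REAL)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?)",
    [(int(o.get("id")), float(o.findtext("amount")))
     for o in root.iter("order")],
)
loaded = conn.execute("SELECT COUNT(*), SUM(amount) FROM orders").fetchone()
```

Row counts and totals checked immediately after the load give a cheap sanity check that the parse captured every record.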
Confidential
Data Analyst
Responsibilities:
- Analyzed daily data reporting operations; identified and developed insights to reduce data quality issues and process gaps.
- Accountable to stakeholders for timely delivery and completion of assigned projects.
- Identified and interpreted patterns and trends, assessed data quality and eliminated irrelevant data.
- Developed stored procedures, SQL joins and SQL queries for data retrieval and analysis.
- Involved in extensive data validation using SQL queries and back-end testing.
- Designed automated reports through MySQL and Excel to reduce manual work.
- Used Microsoft Excel tools like pivot tables, graphs, charts, solver to perform quantitative analysis.
- Created complex SQL queries to fetch data per software needs and created dashboards for visualization.
- Wrote SQL queries using triggers and stored procedures, and used multi-table joins for data retrieval.
- Gathered requirements and performed data mapping to understand key information by creating tables.
- Built and maintained SQL scripts, indexes, and complex queries for data analysis and extraction.
- Analyzed the source data coming from various data sources like Mainframe & MySQL.
- Involved with Transact SQL (T-SQL) Coding, writing queries, cursors, functions, views, & triggers.
- Analyzed and gathered user requirements and created the necessary documentation for data migration.
- Developed SQL Joins, SQL queries, tuned SQL, views, test tables, scripts in development environment.
- Used SQL*Loader to load data from external system and developed PL/SQL programs to dump the data from staging tables into base tables.
- Extensively wrote SQL Queries (Sub queries, correlated sub queries and Join conditions) for Data Accuracy, Data Analysis and Data Extraction needs.
- Evaluated data and resolve data quality issues by analyzing gaps in process and source files.
- Evaluated data mining request requirements and help develop the queries for the requests.
- Performed Data Analysis and Data manipulation of source data from SQL Server and other data structures to support the business organization.
Environment: SQL queries, SQL Joins, MySQL, Microsoft Excel 2012, SQL scripts, PL/SQL, SQL Server, T-SQL
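The correlated subqueries mentioned above re-evaluate the inner query once per outer row; a minimal sketch, with hypothetical table and column names, using SQLite via Python's sqlite3 module:

```python
import sqlite3

# Hypothetical employee table for the correlated-subquery example.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE emp (name TEXT, dept TEXT, salary REAL)")
conn.executemany("INSERT INTO emp VALUES (?, ?, ?)", [
    ("Ann", "IT", 90000), ("Bob", "IT", 70000), ("Cy", "HR", 60000),
])

# Correlated subquery: for each employee, compare salary against
# the average of that employee's own department.
above_avg = conn.execute("""
    SELECT name FROM emp e
    WHERE salary > (SELECT AVG(salary) FROM emp WHERE dept = e.dept)
""").fetchall()
```

Here only Ann exceeds her department's average (IT averages 80000), so the inner query's dependence on the outer row `e.dept` is what makes the subquery correlated rather than a one-shot scalar.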