Sr. Data Modeler/Data Analyst Resume

Philadelphia, PA

SUMMARY:

  • Over 8 years of experience in Data Analysis and Data Modeling, Data Development, Implementation and Maintenance of databases and software applications.
  • Good understanding of Software Development Life cycle (SDLC) including planning, analysis, design, development, testing, implementation.
  • Extensive experience in using ER modeling tools such as Erwin, ER/Studio and Power Designer, as well as MDM tooling.
  • Experience in conducting Joint Application Development (JAD) sessions with SMEs, Stakeholders and other project team members for requirements gathering and analysis.
  • Excellent proficiency in Agile/Scrum and waterfall methodologies.
  • Good experience in working with different reporting tool environments like SQL Server Reporting Services (SSRS), Cognos and Business Objects.
  • Experience in developing conceptual, logical and physical database designs for Online Transaction Processing (OLTP) and Online Analytical Processing (OLAP) systems.
  • Good knowledge of SQL queries and of creating database objects such as stored procedures, triggers, packages and functions using SQL and PL/SQL to implement business rules.
  • Experience in generating DDL (Data Definition Language) Scripts and creating Indexing strategies.
  • Proficient in Normalization/De-normalization techniques in relational/dimensional database environments and have normalized schemas up to 3NF.
  • Strong experience in Data Analysis, Data Profiling, Data Migration, Data Conversion, Data Quality, Data Integration and Metadata Management Services and Configuration management.
  • Excellent experience in writing and executing unit, system, integration and UAT scripts.
  • Experience in data analysis to track data quality and for detecting/correcting inaccurate data from the databases.
  • Knowledge and working experience with AWS tools like Lake, Amazon S3, and Amazon Redshift.
  • Experienced in technical consulting and end-to-end delivery covering data modeling, data governance, and design, development and implementation of solutions.
  • Strong experience working with conceptual, logical and physical data modeling considering Metadata standards.
  • Efficient in all phases of the development lifecycle, coherent with Data Cleansing, Data Conversion, Data Profiling, Data Mapping, Performance Tuning and System Testing.
  • Excellent experience in writing SQL queries to validate data movement between different layers in a data warehouse environment (an illustrative validation query appears after this summary).
  • Good understanding of the Ralph Kimball (dimensional) and Bill Inmon (relational) modeling methodologies.
  • Experience with Teradata utilities such as FastExport and MultiLoad (MLOAD) for handling various tasks.
  • Good working experience with Data Vault modeling, which is used to maintain historical data in the Enterprise Data Warehouse.
  • Experience in Requirement gathering, System analysis, handling business and technical issues & communicating with both business and technical users.
  • A good familiarity and experience in the work environment consisting of Business analysts, Production/ Support teams, Subject Matter Experts, Database Administrators and Database developers.
  • Experience with DBA tasks involving database creation, performance tuning, creation of indexes, creating and modifying table spaces for optimization purposes.
  • Strong experience in using Excel and MS Access to load and analyze data based on business needs.
  • Supported ad-hoc business requests, developed stored procedures and triggers, and extensively used Quest tools such as TOAD.
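
Illustrative only, not project code: a minimal SQL sketch of a layer-to-layer validation query of the kind described above. The staging and warehouse table names (stg_orders, dw_orders) and the order_id/order_amt columns are assumed placeholders.

```sql
-- Reconcile row counts and amount totals between an assumed staging table
-- and its warehouse counterpart.
SELECT 'STG'          AS layer,
       COUNT(*)       AS row_cnt,
       SUM(order_amt) AS total_amt
FROM   stg_orders
UNION ALL
SELECT 'DW',
       COUNT(*),
       SUM(order_amt)
FROM   dw_orders;

-- Rows present in staging but missing from the warehouse layer.
SELECT s.order_id
FROM   stg_orders s
LEFT JOIN dw_orders d ON d.order_id = s.order_id
WHERE  d.order_id IS NULL;
```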

TECHNICAL SKILLS:

Data Modeling Tools: ER/Studio V17, Erwin 9.8/9.7, Power Designer 16.6.

OLAP Tools: Tableau, SAP BO, SSAS, Business Objects, and Crystal Reports 9

Cloud Management: Amazon Web Services (AWS), Redshift

Programming Languages: SQL, PL/SQL, UNIX shell Scripting, PERL, AWK, SED

Databases: Oracle 12c/11g, Teradata R15/R14, MS SQL Server 2016/2014, DB2.

Testing and Defect Tracking Tools: Quality Center, WinRunner, MS Visio 2019

Operating System: Windows, Unix, Sun Solaris

ETL/Reporting Tools: Informatica 9.6/9.1, SAP Business Objects XIR3.1/XIR2, Tableau.

Methodologies: RAD, JAD, RUP, UML, SDLC, Agile, Waterfall Model.

PROFESSIONAL EXPERIENCE:

Confidential - Philadelphia, PA

Sr. Data Modeler/Data Analyst

Responsibilities:

  • Worked as a Sr. Data Modeler/Analyst to generate data models using Erwin and developed relational database systems.
  • Conducted and participated in JAD sessions with the Business Analysis, Finance and development teams to gather, analyze and document the business and reporting requirements.
  • Extensively used Agile methodology as the Organization Standard to implement the data Models.
  • Worked with Azure Machine Learning, Azure Event Hubs, Azure Stream Analytics, Pivot Tables.
  • Created 3NF business area data models with de-normalized physical implementations and performed data and information requirements analysis using Erwin.
  • Developed T-SQL scripts to create databases and database objects and to perform DML and DDL tasks (a brief T-SQL sketch follows this list).
  • Developed detailed ER diagrams and data flow diagrams using modeling tools, following the SDLC structure.
  • Used unified modeling language (UML) to create artifacts for the project related deliverables.
  • Analyzed and developed a design prototype for dimension tables and fact tables based on the warehouse design, along with the development team.
  • Performed reverse engineering using Erwin to redefine entities, attributes and relationships in the existing database.
  • Performed logical data modeling, physical Data modeling (including reverse engineering) using the ERWIN Data Modeling tool.
  • Worked with MDM systems team with respect to technical aspects and generating reports.
  • Created the Data Mapping document to specify the location of each table in the database, conversion rules and business rules.
  • Designed Azure for package deployment, schedule jobs and ETL Framework control.
  • Generated various reports using SQL Server Integration Services (SSIS) and SQL Server Report Services (SSRS) for business analysts and the management team.
  • Wrote requirements for designing data marts that were used as the source for the various systems in the company.
  • Developed and maintained data dictionary to create metadata reports for technical and business purpose.
  • Performed Data Analysis by writing SQL Queries for Testing and Troubleshooting against Data Warehouse.
  • Implemented the Azure Feature Pack to ETL data from on-premises systems to Blob Storage, HDInsight and SQL Azure.
  • Designed Snowflake schemas with a very good understanding of fact and dimension tables for the Data Warehouse using Erwin.
  • Extracted data from different sources like Oracle and text files using SAS/Access, SAS SQL procedures and created SAS datasets.
  • Documented test cases, test plans and test results for the loading processes as well as data validation.
  • Maintained and implemented Data Models for Enterprise Data Warehouse using Erwin.
  • Involved in development and implementation of SSIS, SSRS and SSAS application solutions for various business units across the organization.
  • Worked with project management, business teams and departments to assess and refine requirements to design BI solutions using MS Azure.
  • Involved in migration projects to migrate data from Oracle-based data warehouses to Teradata.
  • Performed Data Manipulation using MS Excel Pivot Sheets and produced various charts for creating the mock reports.
  • Configured & developed the triggers, workflows, validation rules & having hands on the deployment process from one sandbox to other.
  • Generated ad-hoc SQL queries using joins, database connections and transformation rules to fetch data from legacy Oracle and SQL Server database systems.
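
Illustrative only, not project code: a hedged T-SQL sketch of the kind of DDL and DML scripting listed above. The database objects (dbo.CustomerSummary, dbo.Sales) and their columns are assumptions made for the example.

```sql
-- Hypothetical T-SQL: create a summary table with an index, then load it.
IF OBJECT_ID('dbo.CustomerSummary', 'U') IS NULL
BEGIN
    CREATE TABLE dbo.CustomerSummary (
        CustomerId   INT           NOT NULL PRIMARY KEY,
        CustomerName VARCHAR(100)  NOT NULL,
        TotalSales   DECIMAL(18,2) NOT NULL DEFAULT (0),
        LoadDate     DATETIME2     NOT NULL DEFAULT (SYSDATETIME())
    );

    CREATE NONCLUSTERED INDEX IX_CustomerSummary_Name
        ON dbo.CustomerSummary (CustomerName);
END;

-- DML: refresh the summary from an assumed source table dbo.Sales.
INSERT INTO dbo.CustomerSummary (CustomerId, CustomerName, TotalSales)
SELECT s.CustomerId, s.CustomerName, SUM(s.SaleAmount)
FROM   dbo.Sales AS s
GROUP BY s.CustomerId, s.CustomerName;
```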

Environment: Erwin 9.7, SQL, Oracle 12c, Azure, Agile, T-SQL, MDM, SSRS, ETL, SAS, SSIS, SSAS, MS Excel 2019.

Confidential, Eden Prairie, MN

Sr. Data Modeler/Data Analyst

Responsibilities:

  • As a Sr. Data Modeler / Data Analyst I was responsible for all data related aspects of a project.
  • Responsible for data warehousing, data modeling, data governance, standards, methodologies, guidelines and techniques.
  • Translated conceptual models into logical data models, conducted JAD sessions, and communicated data-related issues and standards.
  • Created DDL scripts using Erwin and source to target mappings to bring the data from source to the warehouse.
  • Used Agile Method for daily scrum to discuss the project related information.
  • Designed Physical & Logical Data Model using Erwin with the entities and attributes for each subject areas.
  • Interacted with DBA to discuss database design and modeling, index creations and SQL tuning issues.
  • Connected to AWS Redshift through Tableau to extract live data for real time analysis.
  • Developed test scripts for testing sourced data and their validation and transformation when persisting in data stores that are physical representations of the data models.
  • Designed normalized and star schema data architectures using Erwin and forward engineered these structures into Teradata.
  • Designed and developed SAS macros, applications and other utilities to expedite SAS Programming activities.
  • Used normalization and de-normalization techniques to achieve optimum performance of the database.
  • Created tables, views, sequences, indexes, constraints and generated SQL scripts for implementing physical data model.
  • Designed the data marts using the Ralph Kimball Dimensional Data Mart modeling methodology in Erwin.
  • Extracted large volumes of data from AWS Redshift and the Elasticsearch engine using SQL queries to create reports.
  • Designed and implemented relational data model (OLTP) to support OAC database in Oracle.
  • Prepared data dictionaries/business glossaries and integrated the data dictionary into data models.
  • Created mapping documents to aid with the development of more complex PL/SQL reports.
  • Assisted with user testing of systems, developed and maintained quality procedures, and ensured that appropriate documentation was in place.
  • Worked on Amazon Redshift and AWS as a solution to load data and create data models.
  • Performed extensive data profiling and data analysis to detect and correct inaccurate data in the databases and to track data quality.
  • Provided guidance and solution concepts for multiple projects focused on data governance and master data management (MDM).
  • Improved the performance of existing data warehouse applications to increase the efficiency of the system.
  • Used SQL, PL/SQL to validate the Data going in to the Data warehouse.
  • Worked on Unit Testing for three reports and created SQL Test Scripts for each report as required.
  • Integrated various sources into the staging area of the data warehouse for data integration and cleansing.
  • Involved in extracting, cleansing, transforming, integrating and loading data into different data marts using DataStage Designer.
  • Developed SQL queries to fetch complex data from different tables in remote databases using joins, database links and bulk collects (a PL/SQL sketch follows this list).
  • Worked on Data governance, data quality, data lineage establishment processes.
  • Involved with data profiling for multiple sources and answered complex business questions by providing data to business users.
  • Assisted the project with analytical techniques including data modeling, data mining, regression and hypothesis testing to derive results from large data sets.
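
Illustrative only, not project code: a hedged PL/SQL sketch of fetching remote data over a database link in batches with BULK COLLECT, as referenced in the list above. The link name remote_db, the remote tables and the local staging table stg_customers are assumptions.

```sql
DECLARE
    -- stg_customers is assumed to have exactly the three selected columns.
    TYPE t_cust_tab IS TABLE OF stg_customers%ROWTYPE;
    l_rows t_cust_tab;

    CURSOR c_remote IS
        SELECT c.customer_id, c.customer_name, a.account_balance
        FROM   customers@remote_db c
        JOIN   accounts@remote_db  a ON a.customer_id = c.customer_id;
BEGIN
    OPEN c_remote;
    LOOP
        FETCH c_remote BULK COLLECT INTO l_rows LIMIT 1000;  -- batch size
        EXIT WHEN l_rows.COUNT = 0;

        FORALL i IN 1 .. l_rows.COUNT
            INSERT INTO stg_customers VALUES l_rows(i);

        COMMIT;
    END LOOP;
    CLOSE c_remote;
END;
/
```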

Environment: Erwin 9.7, SQL, AWS, Tableau, Teradata r15, SAS, OLTP, PL/SQL, Oracle 12c, MDM.

Confidential - Greensboro, NC

Data Modeler/Data Analyst

Responsibilities:

  • Heavily involved in the Data Modeler/Analyst role, reviewing business requirements and composing source-to-target data mapping documents.
  • Utilized Power Designer's forward/reverse engineering tools and the target database schema conversion process.
  • Developed full life cycle software including defining requirements, prototyping, designing, coding, testing and maintaining software.
  • Applied a dimensional model structure to achieve an agile data model.
  • Mapped business needs/requirements to subject area model and to logical enterprise model.
  • Wrote PL/SQL statements, stored procedures and triggers in DB2 for extracting as well as writing data.
  • Defined facts and dimensions and designed the data marts using Ralph Kimball's Dimensional Data Mart modeling methodology (an illustrative star-schema sketch follows this list).
  • Implemented a Snowflake schema to minimize redundancy in the database.
  • Conducted Design discussions and meetings to come out with the appropriate Data Warehouse at the lowest level of grain for each of the Dimensions involved.
  • Created Entity Relationship Diagrams (ERD), Functional diagrams, Data flow diagrams and enforced referential integrity constraints.
  • Worked on normalization techniques. Normalized the data into 3rd Normal Form (3NF).
  • Created the XML control files to upload the data into Data warehousing system.
  • Designed scripts in SAS to be compatible with Teradata, to load and access data from the Teradata tables.
  • Created and maintained the Logical Data Model (LDM) for the project, including documentation of all entities, attributes, data relationships, and primary and foreign key structures.
  • Performed Unit testing and UAT testing for various reports created from the data marts.
  • Developed scripts that automated DDL and DML statements used in creations of databases, tables, constraints, and updates.
  • Created data masking mappings to mask the sensitive data between production and test environment.
  • Worked on forward and reverse engineering the DDL for the SQL Server, DB2 and Teradata environments.
  • Created a Data Mapping document after each assignment and wrote the transformation rules for each field as applicable.
  • Tuned SQL applications and added Indices as required for performance and involved in testing through test scripts and test plans.
  • Created conceptual, logical and physical models for OLTP, Data Warehouse, Data Vault and Data Mart, Star/Snowflake schema implementations.
  • Worked on resolving many-to-many relationships, bridge tables, reference tables and master data.
  • Used MS Excel and Teradata for data pools and Ad-hoc reports for business analysis.
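
Illustrative only, not project code: a minimal ANSI-style sketch of the star-schema data mart pattern described above, with one fact table at the order-line grain and two conformed dimensions. Table and column names are hypothetical.

```sql
CREATE TABLE dim_date (
    date_key      INTEGER     NOT NULL PRIMARY KEY,
    calendar_date DATE        NOT NULL,
    fiscal_period CHAR(7)     NOT NULL
);

CREATE TABLE dim_product (
    product_key   INTEGER      NOT NULL PRIMARY KEY,
    product_code  VARCHAR(20)  NOT NULL,
    product_name  VARCHAR(100) NOT NULL
);

-- Fact table at the lowest grain (one row per order line), joined to the
-- dimensions through surrogate keys.
CREATE TABLE fact_sales (
    date_key      INTEGER       NOT NULL REFERENCES dim_date (date_key),
    product_key   INTEGER       NOT NULL REFERENCES dim_product (product_key),
    order_id      INTEGER       NOT NULL,
    line_number   SMALLINT      NOT NULL,
    quantity      INTEGER       NOT NULL,
    sales_amount  DECIMAL(18,2) NOT NULL,
    PRIMARY KEY (order_id, line_number)
);
```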

Environment: Power Designer 16.6, Agile, PL/SQL, XML, SAS, Teradata r15, SQL, OLTP, MS Excel.

Confidential - Columbus, GA

Data Analyst/Data Modeler

Responsibilities:

  • Worked as a Data Analyst/Data Modeler to understand business logic and user requirements.
  • Conducted JAD Sessions with the stakeholders and other management teams in the finalization of the User Requirement Documentation.
  • Analyzed the business requirements of the project by studying the Business Requirement Specification document.
  • Presented the data scenarios via ER/Studio logical models and Excel mockups to visualize the data better.
  • Performed normalization of the existing OLTP systems to Third Normal Form (3NF) to speed up DML statement execution.
  • Involved in data mapping document from source to target and the data quality assessments for the source data.
  • Designed and Developed Oracle PL/SQL and Shell Scripts, Data Import/Export, Data Conversions and Data Cleansing.
  • Created logical Data model from the conceptual model and its conversion into the physical database design using ER/Studio.
  • Designed the data marts in dimensional data modeling using star and snowflake schemas.
  • Involved in extensive data validation by writing several complex SQL queries and Involved in back-end testing and worked with data quality issues.
  • Extensively involved in development and implementation of SSIS, SSRS and SSAS applications.
  • Involved in normalization/de-normalization, normal forms and database design methodology.
  • Designed stunning visualization dashboards using Tableau desktop and publishing dashboards on Tableau server.
  • Extensively involved in data analysis and modeling for the OLAP and OLTP environment.
  • Wrote T-SQL statements for data retrieval and was involved in performance tuning of T-SQL queries and stored procedures (a short tuning sketch follows this list).
  • Worked with data investigation, discovery and mapping tools to scan every single data record from many sources.
  • Facilitated in developing testing procedures, test cases and User Acceptance Testing (UAT).
  • Developed and maintained data dictionary to create metadata reports for technical and business purpose.
  • Conducted user interviews, gathered requirements and analyzed them using Rational Rose and RequisitePro under RUP.
  • Created DataStage Server jobs to load data from sequential files, flat files and MS Access.
  • Used ad hoc queries and Pandas for querying and analyzing the data, and participated in data profiling, data analysis, data validation and data mining.
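
Illustrative only, not project code: a hedged T-SQL sketch of one common tuning pattern mentioned above, replacing a non-sargable filter with a range predicate and wrapping the retrieval in a stored procedure. The dbo.Orders table, its columns and the index are assumptions.

```sql
-- The original filter wrapped the column in a function (non-sargable),
-- e.g. WHERE YEAR(OrderDate) = 2015, which prevents an index seek.
CREATE NONCLUSTERED INDEX IX_Orders_OrderDate
    ON dbo.Orders (OrderDate) INCLUDE (CustomerId, OrderTotal);
GO

CREATE PROCEDURE dbo.usp_GetOrdersByYear
    @Year INT
AS
BEGIN
    SET NOCOUNT ON;

    -- Range predicate on the indexed column allows an index seek.
    SELECT o.OrderId, o.CustomerId, o.OrderDate, o.OrderTotal
    FROM   dbo.Orders AS o
    WHERE  o.OrderDate >= DATEFROMPARTS(@Year, 1, 1)
      AND  o.OrderDate <  DATEFROMPARTS(@Year + 1, 1, 1);
END;
GO
```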

Environment: ER/Studio 14, Oracle 11g, SQL, PL/SQL, OLAP, OLTP, SSIS, SSRS, SSAS, T-SQL, Tableau.

Confidential

Data Analyst

Responsibilities:

  • Acted as a strong Data Analyst, analyzing data at a low level in conversion projects and providing mapping documents between the legacy, production and user interface systems.
  • Worked with business requirements analysts/subject matter experts to identify and understand requirements.
  • Conducted user interviews and data analysis review meetings.
  • Conducted JAD sessions with stakeholders and software development team to analyze the feasibility of needs.
  • Worked with business analysts by providing specific and crucial information through data mining using SQL and UNIX.
  • Developed use cases, test scripts and documentation for User Acceptance Testing (UAT) sessions.
  • Consulted with project manager and tech lead to perform Gap analysis and impact analysis.
  • Generated data extracts in Tableau by connecting to the view using Tableau MySQL connector.
  • Worked with data investigation, discovery and mapping tools to scan every single data record from many sources.
  • Worked with supporting business analysis and marketing campaign analytics with data mining, data processing, and investigation to answer complex business questions.
  • Performed data profiling and preliminary data analysis, handling anomalies such as missing values, duplicates and outliers, and imputing or removing irrelevant data (a profiling-query sketch follows this list).
  • Performed data queries with SQL and prepared monthly financial reports spreadsheets and pivot tables.
  • Documented detailed business requirements for Automated Management, Automated Financial Reporting and Automated Process Notifications to relevant departments.
  • Performed verification, validation and transformations on the input data (text files) before loading it into the target database.
  • Conducted data cleansing and structuring to perform analysis and learn how engaged participants form different networks using R and Tableau.
  • Managed SQL databases and the data warehouse; organized, formatted, validated and performed queries on large data sets.
  • Created presentations for data reporting by using pivot tables, VLOOKUP and other advanced Excel functions.
  • Wrote and executed SQL queries to verify that data has been moved from transactional system to DSS, Data warehouse, data mart reporting system in accordance with requirements.
  • Responsible for different data mapping activities from source systems to Teradata.
  • Performed ad-hoc analyses as needed, with the ability to interpret and explain the analysis.
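
Illustrative only, not project code: a minimal SQL sketch of the profiling checks mentioned above (missing values, duplicates, a crude outlier bound). The customers and orders tables and their columns are assumed for the example.

```sql
-- Null rates on an assumed customers table.
SELECT COUNT(*)                                            AS total_rows,
       SUM(CASE WHEN email      IS NULL THEN 1 ELSE 0 END) AS null_email,
       SUM(CASE WHEN birth_date IS NULL THEN 1 ELSE 0 END) AS null_birth_date
FROM   customers;

-- Duplicate natural keys.
SELECT customer_no, COUNT(*) AS dup_cnt
FROM   customers
GROUP BY customer_no
HAVING COUNT(*) > 1;

-- Crude outlier check: amounts more than 3 standard deviations from the mean.
SELECT o.order_id, o.order_amount
FROM   orders o
CROSS JOIN (SELECT AVG(order_amount)    AS mu,
                   STDDEV(order_amount) AS sigma
            FROM   orders) s
WHERE  ABS(o.order_amount - s.mu) > 3 * s.sigma;
```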

Environment: SQL, UNIX, Tableau, MySQL, MS Excel 2012.
