
Sr. Data Modeler/Data Analyst Resume


Boston, MA

SUMMARY

  • 9+ years of extensive experience in Software Development, including Data Analysis and Data Modeling for Online Transaction Processing (OLTP) and Online Analytical Processing (OLAP) systems, among other formats.
  • Experience in Design, Development, Testing and Maintenance of various Data Warehousing and Business Intelligence (BI) applications in complex business environments.
  • Well versed in Conceptual, Logical/Physical, Relational and Multi-dimensional modeling, Data analysis for Decision Support Systems (DSS), Data Transformation (ETL) and Reporting.
  • Proficient in developing Entity-Relationship diagrams and Star/Snowflake Schema designs, and expert in modeling Transactional Databases and Data Warehouses.
  • Efficient in all phases of the development lifecycle, including Data Cleansing, Data Conversion, Data Profiling, Data Mapping, Performance Tuning and System Testing.
  • Proficient in Normalization/De-normalization techniques in relational and dimensional database environments, having normalized data up to 3NF.
  • Efficient in Dimensional Data Modeling for Data Mart design, identifying Facts and Dimensions, and creating cubes.
  • Hands-on experience in SQL and PL/SQL programming; performed end-to-end ETL validations and supported ad hoc business requests. Developed Stored Procedures and Triggers and extensively used Quest tools such as TOAD.
  • Good understanding of the Ralph Kimball (Dimensional) and Bill Inmon (Relational) modeling methodologies.
  • Strong experience in Data Analysis, Data Profiling, Data Migration, Data Conversion, Data Quality, Data Integration and Metadata Management Services and Configuration management.
  • Strong experience in working with Excel Pivot Tables and VBA Macros for various business scenarios.
  • Experience in generating DDL (Data Definition Language) scripts and creating indexing strategies (see the DDL sketch following this list).
  • Excellent SQL programming skills; developed Stored Procedures, Triggers, Functions and Packages using SQL and PL/SQL, and applied performance tuning and query optimization techniques in transactional and data warehouse environments.
  • Experience in Data Analysis and Data Profiling using complex SQL on various source systems, including Oracle and Teradata.
  • Experience in developing dashboard reports using SQL Server Reporting Services (SSRS).
  • Expert in Agile/Scrum and Waterfall methodologies.
  • Experience in developing Entity Relationship diagrams and modeling Transactional Databases and Data Warehouse using tools like ERWIN, ER/Studio and Power Designer.
  • Strong experience in using Excel and MS Access to load and analyze data based on business needs.
  • Hands-on experience in analyzing data using the Hadoop Ecosystem, including HDFS, Hive, Streaming, Elastic Search, Kibana, Kafka, HBase, Zookeeper, Pig, Sqoop, and Flume.
  • Experience in developing MapReduce programs using Apache Hadoop to analyze big data as required.
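
To illustrate the star-schema design, DDL generation and indexing work summarized above, the following is a minimal sketch in generic SQL; all object names (dim_customer, dim_date, fact_sales) are hypothetical examples rather than artifacts of any specific project.

    -- Hypothetical star schema: one fact table joined to two dimensions.
    CREATE TABLE dim_customer (
        customer_key   INTEGER      PRIMARY KEY,   -- surrogate key
        customer_id    VARCHAR(20)  NOT NULL,      -- natural/business key
        customer_name  VARCHAR(100),
        region         VARCHAR(50)
    );

    CREATE TABLE dim_date (
        date_key       INTEGER      PRIMARY KEY,   -- e.g. 20180131
        calendar_date  DATE         NOT NULL,
        fiscal_quarter VARCHAR(6)
    );

    CREATE TABLE fact_sales (
        customer_key   INTEGER NOT NULL REFERENCES dim_customer (customer_key),
        date_key       INTEGER NOT NULL REFERENCES dim_date (date_key),
        sales_amount   DECIMAL(12,2),
        units_sold     INTEGER
    );

    -- Indexing strategy: index the foreign keys that join fact to dimensions.
    CREATE INDEX ix_fact_sales_customer ON fact_sales (customer_key);
    CREATE INDEX ix_fact_sales_date     ON fact_sales (date_key);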

TECHNICAL SKILLS

Data Modeling Tools: Erwin Data Modeler, Erwin Model Manager, ER/Studio v17, and Power Designer 16.6.

Programming Languages: SQL, PL/SQL, HTML5, C++, JAVA, PHP 7.2, XML and VBA.

Reporting Tools: SSRS, Power BI, Tableau, SSAS, MS-Excel, SAS BI Platform.

Big Data Technologies: HBase 1.2, HDFS, Sqoop 1.4, Spark, Hadoop 3.0, Hive 2.3.

Cloud Management: Amazon Web Services (AWS), Redshift.

OLAP Tools: Tableau 7, SAP BO, SSAS, Business Objects, and Crystal Reports 9.

Databases: Oracle 12c/11g, Teradata R15/R14, MS SQL Server 2016/2014, DB2.

Operating Systems: Windows, Unix, Sun Solaris.

ETL/Data Warehouse Tools: Informatica 9.6/9.1, SAP Business Objects XIR3.1/XIR2, Talend, and Pentaho.

Methodologies: RAD, JAD, RUP, UML, System Development Life Cycle (SDLC), Agile, Waterfall Model.

AWS Tools: EC2, S3 Bucket, AMI, RDS, Redshift.

PROFESSIONAL EXPERIENCE

Confidential - Boston, MA

Sr. Data Modeler/Data Analyst

Responsibilities:

  • As a Data Modeler/Data Analyst, I was responsible for all data-related aspects of the project.
  • Translated business and data requirements into data models in support of Enterprise Data Models, Data Warehouse and Analytical systems.
  • Created Logical and Physical Data Models for the Relational (OLTP) system and a Star schema of Fact and Dimension tables using Erwin.
  • Developed the Data Warehouse dimensional model and maintained the Data Warehouse Bus Matrix architecture.
  • Worked on master data (entities and attributes) and captured how data is interpreted by users in various parts of the organization.
  • Conducted JAD sessions with management, vendors, users and other stakeholders for open and pending issues to develop specifications.
  • Performed rigorous data analysis, data discovery and data profiling.
  • Performed GAP analysis to analyze the difference between the system capabilities and business requirements.
  • Used the Agile method, with daily scrums to discuss project-related information.
  • Maintained Data Mapping documents, Bus Matrix and other Data Design artifacts that define technical data specifications and transformation rules.
  • Analyzed database performance factors and trends pertaining to very large database design.
  • Collaborated with DBAs to implement mitigating physical modeling solutions.
  • Provided data structures optimized for information entry and retrieval.
  • Designed the data marts in Erwin using Ralph Kimball's Dimensional Data Mart modeling methodology.
  • Worked on normalization techniques and normalized the data into Third Normal Form (3NF).
  • Worked with Normalization and De-normalization concepts and design methodologies such as the Ralph Kimball and Bill Inmon approaches, and implemented Slowly Changing Dimensions (see the SCD sketch following this list).
  • Performed detailed data analysis of the duration of claims processes.
  • Created the cubes with Star Schemas using facts and dimensions through SQL Server Analysis Services (SSAS).
  • Used forward engineering to create tables, views, SQL scripts and mapping documents.
  • Used reverse engineering to connect to existing databases and create graphical representations (E-R diagrams).
  • Created and maintained the metadata (data dictionary) for the data models.
  • Designed the OLTP system environment and maintained metadata documentation.
  • Monitored data quality and ensured the integrity of data was maintained for the effective functioning of the department.
  • Developed PL/SQL Stored Procedures, Functions, Packages and Triggers.
  • Managed database design and implemented a comprehensive Star-Schema with shared dimensions.
  • Created and published regularly scheduled and ad hoc reports as needed.
  • Involved in data lineage and Informatica ETL source to target mapping development, complying with data quality and governance standards.
  • Designed and developed data integration programs in a Hadoop environment with the NoSQL data store Cassandra for data access and analysis.
  • Used Pig to write complex data transformations, cleaning and processing large data sets and storing the data in HDFS.
  • Wrote and executed unit, system, integration and UAT scripts for Data Warehouse projects.
  • Assisted in defining business requirements and created BRD (Business Requirements Document) and functional specifications documents.
  • Supported the development and QA teams during process design, performance tuning, test strategy and test case development.
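
A minimal sketch of the Slowly Changing Dimension (Type 2) pattern referenced above, in generic SQL; dim_customer, stg_customer and their columns are hypothetical placeholders, and exact date syntax varies by database.

    -- Step 1: expire the current dimension row when a tracked attribute changes.
    UPDATE dim_customer
    SET    end_date   = CURRENT_DATE,
           is_current = 'N'
    WHERE  is_current = 'Y'
    AND    EXISTS (SELECT 1
                   FROM   stg_customer s
                   WHERE  s.customer_id = dim_customer.customer_id
                   AND    s.region     <> dim_customer.region);

    -- Step 2: insert a new current row for changed and brand-new customers.
    INSERT INTO dim_customer (customer_id, customer_name, region,
                              start_date, end_date, is_current)
    SELECT s.customer_id, s.customer_name, s.region,
           CURRENT_DATE, DATE '9999-12-31', 'Y'
    FROM   stg_customer s
    WHERE  NOT EXISTS (SELECT 1
                       FROM   dim_customer d
                       WHERE  d.customer_id = s.customer_id
                       AND    d.is_current  = 'Y');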

Environment: OLTP, Erwin v9.7, Agile, MDM, Oracle 12c, SQL, XML, AWS, Cassandra 3.11, HDFS, Python 3.6, Redshift, Pig 0.17, NoSQL, Hadoop 3.0 and PL/SQL

Confidential - Troy, NY

Sr. Data Modeler/Data Analyst

Responsibilities:

  • Gathered source data and performed data preparation and data cleaning for modeling and analysis.
  • Developed and maintained enterprise and system level conceptual, logical, and physical data models and data dictionaries.
  • Worked with SMEs, Business Analysts and Technology teams to understand the data requirements and the full attribute set for entities and dimensions.
  • Extensively used the Agile methodology as the organization standard to implement the data models.
  • Converted business processes, domain specific knowledge, and information needs into a conceptual model.
  • Converted conceptual models into logical models with detailed descriptions of entities and dimensions using ER/Studio.
  • Integrated disparate data models into coherent enterprise data models.
  • Created and revised data integration modeling standards and guidelines.
  • Involved in Mapping source and target data, as well as performing logical model to physical model mapping.
  • Worked with Data Steward Team for designing, documenting and configuring Informatica Data Director for supporting management of MDM data.
  • Developed data models and data migration strategies utilizing data modeling techniques, including star and snowflake schemas.
  • Created reusable error-free DDL scripts for the promotion of structural changes to SQL Server for development, test and production.
  • Designed and developed data mapping and transformation scripts to support data warehouse development and data analytics efforts.
  • Developed extract, transform and load (ETL) logic and code in support of data warehouse and analytics operations and maintained the related data pipelines.
  • Designed and developed a Data Lake using Hadoop for processing raw and processed claims via Hive.
  • Executed Hive queries on Parquet tables to perform data analysis meeting the business requirements (see the Hive sketch following this list).
  • Performed data analysis using SQL queries.
  • Performed ad hoc data analysis with the use of statistical analysis and data mining techniques.
  • Forward engineered physical models to create DDL scripts for implementing new databases or database changes.
  • Reverse engineered databases and synchronized data models with the actual data implementations.
  • Created data dictionaries and business glossaries to document data lineages, data definitions and metadata for all business-critical data domains.
  • Developed canonical models and data transformation rules using XML.
  • Created and managed schema objects such as tables, views, indexes, user-defined functions and stored procedures using T-SQL.
  • Developed advanced SQL queries with multi-table joins, group functions, sub-queries, set operations and T-SQL stored procedures, user defined functions (UDFs) for data analysis.
  • Created SQL reports, data extraction and data loading scripts for different databases and Schemas.
  • Developed Dashboard Reports using SQL Server Reporting Services (SSRS), Report Model using Report Builder.
  • Created and executed SQL statements in both SQL production and development environments to evaluate data correctness.
  • Successfully deployed reports in various formats such as XML, web browser and PDF.
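
A minimal sketch of the Parquet-backed Hive analysis referenced above, in Hive's SQL dialect; claims_parquet, its columns and the date threshold are hypothetical placeholders.

    -- Hypothetical Parquet table holding processed claims in the Data Lake.
    CREATE TABLE IF NOT EXISTS claims_parquet (
        claim_id      STRING,
        member_id     STRING,
        claim_amount  DECIMAL(12,2),
        claim_status  STRING,
        service_date  DATE
    )
    STORED AS PARQUET;

    -- Example analysis: claim counts and amounts per status over a date window.
    SELECT claim_status,
           COUNT(*)          AS claim_count,
           AVG(claim_amount) AS avg_amount,
           SUM(claim_amount) AS total_amount
    FROM   claims_parquet
    WHERE  service_date >= DATE '2018-01-01'
    GROUP BY claim_status
    ORDER BY total_amount DESC;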

Environment: Agile, ER/Studio v17, ETL, SQL Server 2016, Hadoop 3.0, Data Lake, Hive 2.3, Metadata, XML, T-SQL, SSRS, SQL, MDM, PL/SQL.

Confidential - Tampa, FL

Sr. Data Analyst

Responsibilities:

  • Analyzed the OLTP source systems and Operational Data Store and researched the tables/entities required for the project.
  • Designed the measures, dimensions and facts matrix document to ease subsequent design work.
  • Created data flowcharts and attribute mapping documents, and analyzed source meanings to retain and provide proper business names following FTB's stringent data standards.
  • Developed several scripts to gather all the required data from different databases to build the LAR file monthly.
  • Developed numerous reports to capture the transactional data for the business analysis.
  • Developed complex SQL queries to bring data together from various systems.
  • Organized and conducted cross-functional meetings to ensure linearity of the phase approach.
  • Collaborated with a team of Business Analysts to ascertain capture of all requirements.
  • Created multiple reports on the daily transactional data which involves millions of records.
  • Used joins such as inner and outer joins while creating tables from multiple tables.
  • Created multiset, temporary, derived and volatile tables in the Teradata database (see the Teradata sketch following this list).
  • Implemented indexes, collected statistics and applied constraints while creating tables.
  • Utilized ODBC connectivity from MS Excel to Teradata to retrieve data automatically from the Teradata database.
  • Designed and developed various ad hoc reports for different business teams based on the requirements (Teradata and Oracle SQL, MS Access, MS Excel).
  • Developed SQL queries to fetch complex data from different tables in remote databases using joins and database links, formatted the results into reports and kept logs.
  • Involved in writing complex SQL queries using correlated sub queries, joins, and recursive queries.
  • Delivered artifacts within the agreed timelines while maintaining a high quality of deliverables.
  • Validated the data during UAT testing.
  • Performed source-to-target mapping.
  • Involved in metadata management: listed all table specifications and implemented them in the Ab Initio metadata hub per data governance standards.
  • Developed Korn Shell scripts to extract and process data from different sources in parallel, streamlining execution and improving time and resource management.
  • Used Teradata utilities such as TPT (Teradata Parallel Transporter), FastLoad and MultiLoad for handling various load tasks.
  • Developed Logical data model using Erwin and created physical data models using forward engineering.
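
A minimal sketch of the Teradata table work referenced above (multiset and volatile tables, statistics collection), in Teradata's SQL dialect; all object names are hypothetical placeholders.

    -- Multiset table: allows duplicate rows, distributed on a primary index.
    CREATE MULTISET TABLE txn_detail (
        txn_id      INTEGER,
        account_id  INTEGER,
        txn_amount  DECIMAL(12,2),
        txn_date    DATE
    )
    PRIMARY INDEX (account_id);

    -- Volatile table: session-scoped scratch space for intermediate results.
    CREATE VOLATILE TABLE daily_totals AS (
        SELECT account_id,
               txn_date,
               SUM(txn_amount) AS total_amount
        FROM   txn_detail
        GROUP BY account_id, txn_date
    ) WITH DATA
    PRIMARY INDEX (account_id)
    ON COMMIT PRESERVE ROWS;

    -- Collect statistics so the optimizer can plan joins on this column.
    COLLECT STATISTICS ON txn_detail COLUMN (account_id);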

Environment: Erwin 8.0, Teradata 13, TOAD, Oracle 10g/11g, MS SQL Server 2008, Teradata SQL Assistant, XML Files, Flat files

Confidential - Portsmouth, NH

Sr. Data Analyst/Data Modeler

Responsibilities:

  • As a Data Analyst/Data Modeler, I was responsible for all data-related aspects of the project.
  • Worked with DBAs and the security coordinators to obtain access for the team members.
  • Participated in requirement gathering sessions, conducted JAD sessions with users, subject matter experts and business analysts.
  • Developed conceptual model using Erwin based on business requirements.
  • Developed and normalized logical and physical database models for OLTP finance applications.
  • Extensively used normalization techniques (up to 3NF).
  • Produced functional decomposition diagrams and defined logical data model.
  • Involved in redesigning the existing OLTP systems, as well as designing modifications and new requirements in those systems.
  • Involved in designing the context flow diagrams, structure chart and ER-diagrams.
  • Performed forward engineering to create a physical SAS model with DDL, based on the requirements from the logical data model.
  • Developed the code as per the client's requirements using SQL, PL/SQL and Data Warehousing concepts.
  • Wrote T-SQL statements for data retrieval and was involved in performance tuning of T-SQL queries and Stored Procedures.
  • Involved in designing and developing SQL Server objects such as Tables, Views, Indexes (Clustered and Non-Clustered), Stored Procedures and Functions in Transact-SQL (see the T-SQL sketch following this list).
  • Involved with data profiling for multiple sources and answered complex business questions by providing data to business users.
  • Implemented referential integrity using primary key and foreign key relationships.
  • Involved in development and implementation of SSIS, SSRS and SSAS applications.
  • Performed extensive data analysis and data validation on Teradata.
  • Generated ad hoc SQL queries using joins, database connections and transformation rules to fetch data from legacy Oracle and SQL Server database systems.
  • Assisted Reporting developers in building Reports using Crystal Reports.
  • Used Erwin for reverse engineering to connect to existing database and ODS to create graphical representation in the form of Entity Relationships and elicit more information.
  • Acted as a strong Data Analyst, analyzing data at a low level in conversion projects and providing mapping documents between legacy, production and user interface systems.
  • Used Erwin's Model Mart for effective model management, sharing, dividing and reusing model information and designs to improve productivity.
  • Interacted with client, management and staff to identify and document business needs and objectives, current operational procedures for creating the logical data model.
  • Facilitated in developing testing procedures, test cases and User Acceptance Testing (UAT).
  • Extensively used the SSIS Import/Export Wizard for performing ETL operations.
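
A minimal Transact-SQL sketch of the table, key and index design work referenced above; all object names are hypothetical placeholders.

    -- Parent table with a clustered primary key.
    CREATE TABLE dbo.Account (
        AccountId   INT          NOT NULL,
        AccountName VARCHAR(100) NOT NULL,
        CONSTRAINT PK_Account PRIMARY KEY CLUSTERED (AccountId)
    );

    -- Child table; the foreign key enforces referential integrity.
    CREATE TABLE dbo.AccountTxn (
        TxnId     INT           NOT NULL,
        AccountId INT           NOT NULL,
        TxnAmount DECIMAL(12,2) NOT NULL,
        TxnDate   DATE          NOT NULL,
        CONSTRAINT PK_AccountTxn PRIMARY KEY CLUSTERED (TxnId),
        CONSTRAINT FK_AccountTxn_Account
            FOREIGN KEY (AccountId) REFERENCES dbo.Account (AccountId)
    );

    -- Non-clustered index to support lookups by account and date range.
    CREATE NONCLUSTERED INDEX IX_AccountTxn_Account_Date
        ON dbo.AccountTxn (AccountId, TxnDate);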

Environment: JAD, Erwin 7.5, OLTP, 3NF, DDL, T-SQL, SSRS, SSAS, Teradata 13, Crystal Reports 7.0, ODS, UAT

Confidential

Data Analyst

Responsibilities:

  • Gathered and translated business requirements into detailed, production-level technical specifications, new features, and enhancements to existing technical business functionality.
  • Used the Waterfall methodology to carry out the different phases of the software development life cycle.
  • Participated in all phases of data mining: data collection, data cleaning, model development, validation and visualization; also performed gap analysis.
  • Demonstrated experience in the design and implementation of statistical models, predictive models, enterprise data models, metadata solutions and data life cycle management.
  • Performed data analysis and reporting using MS PowerPoint, MS Access and SQL Assistant.
  • Generated periodic reports based on the statistical analysis of the data using SQL Server Reporting Services.
  • Developed SQL scripts for creating tables, sequences, triggers, views and materialized views (see the Oracle sketch following this list).
  • Compiled data from various sources, public and private databases, to perform complex analysis and data manipulation for actionable results.
  • Used and maintained a database in MS SQL Server to extract data inputs from internal systems.
  • Interacted with the Client and documented the Business Reporting needs to analyze the data.
  • Used SAS for pre-processing data, SQL queries, data analysis, generating reports, graphics, and statistical analyses.
  • Migrated databases from legacy systems and SQL Server to Oracle.
  • Performed data analysis and statistical analysis, and generated reports, listings and graphs using SAS tools: SAS/Base, SAS/Macros, SAS Graph, SAS/SQL, SAS/Connect and SAS/Access.
  • Developed data mapping documentation to establish relationships between source and target tables including transformation processes using SQL.
  • Performed extensive data cleansing and analysis in advanced Microsoft Excel using pivot tables, formulas (VLOOKUP and others), data validation, conditional formatting, and graph and chart manipulation.
  • Created pivot tables and charts from worksheet data and external sources; modified pivot tables, sorted items, grouped data, and refreshed and formatted pivot tables.
  • Worked with CSV files to load input data from the MySQL database.
  • Created functions, triggers, views and stored procedures using MySQL.
  • Worked on database testing, wrote complex SQL queries to verify the transactions and business logic.
  • Worked on data profiling and data validation to ensure the accuracy of the data between the warehouse and source systems.
  • Developed SQL Queries to fetch complex data from different tables in remote databases using joins, database links and Bulk collects.
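
A minimal Oracle-dialect sketch of the sequence, trigger and materialized-view scripting referenced above; the orders table and all object names are hypothetical placeholders.

    -- Sequence plus trigger to assign surrogate keys on insert.
    CREATE SEQUENCE order_seq START WITH 1 INCREMENT BY 1;

    CREATE OR REPLACE TRIGGER order_bi
    BEFORE INSERT ON orders
    FOR EACH ROW
    BEGIN
        SELECT order_seq.NEXTVAL INTO :NEW.order_id FROM dual;
    END;
    /

    -- Materialized view pre-aggregating order totals for reporting.
    CREATE MATERIALIZED VIEW mv_order_totals
    BUILD IMMEDIATE
    REFRESH COMPLETE ON DEMAND
    AS
    SELECT customer_id,
           COUNT(*)       AS order_count,
           SUM(order_amt) AS total_amount
    FROM   orders
    GROUP BY customer_id;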

Environment: Erwin 8.0, SDLC, MS PowerPoint, MS Access, MS SQL Server 2008, SAS, Oracle 10g, Microsoft Excel
