Sr. Data Modeler/Data Analyst Resume

Irving, TX

SUMMARY:

  • 6 years of experience as a Data Analysis, Data Modeling, Data Warehouse, and Business Intelligence professional with applied Information Technology.
  • Experience in Design, Development, Testing and Maintenance of various Data Warehousing and Business Intelligence (BI) applications in complex business environments.
  • Well versed in Conceptual, Logical/Physical, Relational and Multi-dimensional modeling, Data analysis for Decision Support Systems (DSS), Data Transformation (ETL) and Reporting.
  • Proficient in developing Entity-Relationship diagrams, Star/Snow Flake Schema Designs, and expert in modeling Transactional Databases and Data Warehouse.
  • Efficient in all phases of the development lifecycle, covering Data Cleansing, Data Conversion, Data Profiling, Data Mapping, Performance Tuning and System Testing.
  • Proficient in Normalization/De-normalization techniques in relational/dimensional database environments and have done normalizations up to 3NF.
  • Efficient in Dimensional Data Modeling for Data Mart design, identifying Facts and Dimensions, creation of cubes.
  • Experienced in Technical consulting and end-to-end delivery with architecture, data modeling, data governance and design - development - implementation of solutions.
  • Hands on experience in SQL, PL/SQL programming, performed End-to-End ETL validations and supported Ad-hoc business requests. Developed Stored Procedures and Triggers and extensively used Quest tools like TOAD.
  • Good understanding of Ralph Kimball (Dimensional) and Bill Inmon (Relational) modeling methodologies.
  • Designed and implemented Azure Stream Analytics (ASA) jobs and Power BI reports for an IoT Analytics platform (a sample Stream Analytics query follows this list).
  • 3 years of Azure developer and Azure administrator experience, including managing Azure SQL with DBA activities.
  • Experience with Teradata utilities such as FastExport and MultiLoad (MLOAD) for handling various tasks.
  • Strong experience in Data Analysis, Data Profiling, Data Migration, Data Conversion, Data Quality, Data Integration and Metadata Management Services and Configuration management.
  • Strong Experience in working with Excel Pivot and VBA Macros for various business scenarios.
  • Experience in generating DDL (Data Definition Language) scripts and creating indexing strategies (a brief DDL sketch follows this list).
  • Developed a solution to copy an on-premises data warehouse to Azure using Data Factory, Logic Apps, and Azure SQL.
  • Hands-on experience working with AWS RDS, including creating, updating, and backing up databases.
  • Excellent SQL programming skills; developed Stored Procedures, Triggers, Functions, and Packages using SQL/PL-SQL, and applied performance tuning and query optimization techniques in transactional and data warehouse environments.
  • Experience in Data analysis and Data profiling using complex SQL on various source systems including Oracle and Teradata.
  • Experience in dashboard reports using SQL Server reporting services (SSRS).
  • Expert in Agile/Scrum and waterfall methodologies.
  • Worked with AWS infrastructure and numerous services such as S3, EC2, RDS, CloudWatch, IAM, VPC, and Route 53.
  • Experienced with event-driven and scheduled AWS Lambda functions to trigger various AWS resources.
  • Experience in developing Entity Relationship diagrams and modeling Transactional Databases and Data Warehouse using tools like ERWIN, ER/Studio and Power Designer.
  • Strong experience in using Excel and MS Access to load and analyze data based on business needs.
  • Good understanding of Access queries, VLOOKUP, formulas, Pivot Tables, etc.
  • Working knowledge of CRM automation with Salesforce.com and SAP.
  • Hands-on experience in analyzing data using the Hadoop ecosystem, including HDFS, Hive, Streaming, Elasticsearch, Kibana, Kafka, HBase, Zookeeper, Pig, Sqoop, and Flume.
  • Experience in developing MapReduce programs using Apache Hadoop for analyzing big data per requirements.
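
A minimal sketch of the kind of Stream Analytics query used for the IoT analytics work noted above (Azure Stream Analytics uses a T-SQL-like query language); the input/output aliases and field names below are assumptions, not taken from the actual platform:

    -- Aggregate device telemetry into 5-minute tumbling windows and route it to a Power BI output
    SELECT
        deviceId,
        AVG(temperature)   AS avg_temperature,
        System.Timestamp() AS window_end
    INTO
        [powerbi-output]                          -- hypothetical Power BI output alias
    FROM
        [iothub-input] TIMESTAMP BY eventTime     -- hypothetical IoT Hub input alias
    GROUP BY
        deviceId,
        TumblingWindow(minute, 5)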
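
A condensed DDL sketch of the star schema and indexing work described above; all table, column, and index names are illustrative only:

    -- Illustrative dimension table with a surrogate key
    CREATE TABLE dim_customer (
        customer_key   INT IDENTITY(1,1) PRIMARY KEY,    -- surrogate key
        customer_id    VARCHAR(20) NOT NULL,             -- natural/business key
        customer_name  VARCHAR(100),
        state_code     CHAR(2)
    );

    -- Illustrative fact table, one row per claim line
    CREATE TABLE fact_claims (
        claim_key      BIGINT IDENTITY(1,1) PRIMARY KEY,
        customer_key   INT NOT NULL REFERENCES dim_customer (customer_key),
        date_key       INT NOT NULL,                      -- FK to a date dimension
        claim_amount   DECIMAL(12,2),
        claim_count    INT DEFAULT 1
    );

    -- Index the foreign keys that drive star-join queries
    CREATE INDEX ix_fact_claims_customer ON fact_claims (customer_key);
    CREATE INDEX ix_fact_claims_date     ON fact_claims (date_key);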

TECHNICAL SKILLS:

Data Modeling Tools: Erwin Data Modeler 9.7/9.6, ER Studio v17, and Power Designer.

Big Data technologies: HBase 1.2, HDFS, Sqoop 1.4, Spark, Hadoop 3.0, Hive 2.3

AWS tools: EC2, S3 Bucket, AMI, RDS, Redshift

ETL/Data warehouse Tools: Informatica 9.6/9.1, SAP Business Objects XIR3.1/XIR2, Talend and Pentaho.

Reporting Tools: SSRS, Power BI, Tableau, SSAS, MS Excel, SAS BI Platform.

Programming/Scripting: UNIX Shell Scripting, HTML, T-SQL, Data Structures, Algorithms.

OLAP Tools: Tableau 7, SAP BO, SSAS, Business Objects, and Crystal Reports 9

Databases: Oracle 12c/11g, Teradata R15/R14, MS SQL Server 2016/2014, DB2.

Operating System: Windows, Unix, Sun Solaris

Other tools: SQL*Plus, SQL*Loader, MS Project, MS Visio, UNIX, PL/SQL, etc.

Methodologies: RAD, JAD, RUP, UML, System Development Life Cycle (SDLC), Agile, Waterfall Model.

PROFESSIONAL EXPERIENCE:

Sr. Data Modeler/Data Analyst

Confidential, Irving, TX

Responsibilities:

  • Developed full life cycle software including defining requirements, prototyping, designing, coding, testing and maintaining software.
  • Created Logical & Physical Data Model on Relational (OLTP) on Star schema for Fact and Dimension tables using Erwin.
  • Performed GAP analysis to analyze the difference between the system capabilities and business requirements.
  • Developed SSIS packages and integrated them into Azure Data Factory V2.
  • Used Agile methodology with daily scrums to discuss project-related information.
  • Prepared ETL technical Mapping Documents along with test cases for each Mapping for future developments to maintain Software Development Life Cycle (SDLC).
  • Involved in Data flow analysis, Data modeling, Physical database design, forms design and development, data conversion, performance analysis and tuning.
  • Created sampling specification, performed quality assurance, created multiple datasets to communicate with Medicare Advantage plans and analyzed Medicare plans response.
  • Created and maintained data model standards, including master data management (MDM) and Involved in extracting the data from various sources like Oracle, SQL, Teradata, and XML.
  • Designed the data marts using the Ralph Kimball's Dimensional Data Mart modeling methodology using Erwin.
  • Used AWS CloudWatch and Nagios as performance monitoring and analytics tools.
  • Worked on Master data Management (MDM) Hub and interacted with multiple stakeholders.
  • Worked with medical claim data in the Oracle database for Inpatient/Outpatient data validation, trend and comparative analysis.
  • Designed and implemented an Azure Data Lake.
  • Designed and developed data warehouse ETL jobs using Azure Data Factory and Azure Databricks.
  • Developed Entity-Relationship diagrams and Star/Snowflake schema designs, and modeled transactional databases and the data warehouse.
  • Worked on normalization techniques, normalized the data into 3rd Normal Form (3NF).
  • Worked with Normalization and De-normalization concepts and design methodologies such as the Ralph Kimball and Bill Inmon approaches, and implemented Slowly Changing Dimensions (see the SCD Type 2 sketch following this list).
  • Implemented Forward engineering to create tables, views and SQL scripts and mapping documents.
  • Responsible for all BI team Azure-based solutions, along with developing the implementation strategy.
  • Used reverse engineering to connect to existing database and create graphical representation (E-R diagram).
  • Designed and developed the architecture for a data services ecosystem spanning Relational, NoSQL, and Big Data technologies under the guidance of several Data Architects.
  • Developed Azure Logic Apps to trigger workflows against Azure SQL DB.
  • Performance tuning and stress-testing of NoSQL database environments in order to ensure acceptable database performance in production mode.
  • Developed Data Mapping, Data Governance, and Transformation and cleansing rules for the Master Data Management Architecture involving OLTP, ODS.
  • Assisted in defining business requirements and created BRD (Business Requirements Document) and functional specifications documents.
  • Developed automated data pipelines in Python from various external data sources (web pages, APIs, etc.) to the internal data warehouse (SQL Server), then exported data to reporting tools such as Datorama.
  • Connected to AWS Redshift through Tableau to extract live data for real time analysis.
  • Performed data analysis and data profiling using complex SQL on various source systems including Oracle and Netezza.
  • Monitored data quality and maintained data integrity to ensure the effective functioning of the department.
  • Managed database design and implemented a comprehensive Star-Schema with shared dimensions.
  • Analyzed the data consuming the most resources and made changes to the back-end code using PL/SQL stored procedures and triggers.
  • Developed and maintained stored procedures, implemented changes to database design including tables and views and Documented Source to Target mappings as per the business rules.
  • Conducted Design reviews with the business analysts and the developers to create a proof of concept for the reports.
  • Performed detailed data analysis to analyze the duration of claim processes and created the cubes with Star Schemas using facts and dimensions through SQL Server Analysis Services (SSAS).
  • Deployed SSRS reports to Report Manager and created linked reports, snapshots, and subscriptions for the reports and worked on scheduling of the reports.
  • Built parameterized queries for tabular reports using global variables, expressions, functions, and stored procedures in SSRS (a sample parameterized procedure follows this list).
  • Developed and scheduled a variety of reports such as cross-tab, parameterized, drill-through, and sub-reports with SSRS.
  • Directed and oversaw data quality tests, including providing input to quality assurance team members.
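
A condensed T-SQL sketch of the Slowly Changing Dimension (Type 2) handling mentioned above; the staging and dimension tables (stg_member, dim_member) and their columns are hypothetical:

    -- Step 1: expire the current dimension row when a tracked attribute changes
    UPDATE d
    SET    d.is_current = 0,
           d.end_date   = GETDATE()
    FROM   dim_member AS d
    JOIN   stg_member AS s ON s.member_id = d.member_id
    WHERE  d.is_current = 1
      AND (s.member_name <> d.member_name OR s.state_code <> d.state_code);

    -- Step 2: insert a new current row for new and changed members
    INSERT INTO dim_member (member_id, member_name, state_code, start_date, end_date, is_current)
    SELECT s.member_id, s.member_name, s.state_code, GETDATE(), NULL, 1
    FROM   stg_member AS s
    LEFT JOIN dim_member AS d ON d.member_id = s.member_id AND d.is_current = 1
    WHERE  d.member_id IS NULL;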
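
A small T-SQL sketch of a parameterized stored procedure of the kind used behind the SSRS reports above; it reuses the illustrative fact_claims/dim_customer tables from the DDL sketch in the Summary plus a hypothetical dim_date, and the procedure name and parameters are assumptions:

    -- Illustrative procedure feeding a parameterized tabular report
    CREATE PROCEDURE dbo.rpt_claims_by_period
        @StartDate DATE,
        @EndDate   DATE,
        @StateCode CHAR(2) = NULL          -- optional filter exposed as a report parameter
    AS
    BEGIN
        SET NOCOUNT ON;

        SELECT   d.state_code,
                 COUNT(*)            AS claim_count,
                 SUM(f.claim_amount) AS total_amount
        FROM     fact_claims  AS f
        JOIN     dim_customer AS d  ON d.customer_key = f.customer_key
        JOIN     dim_date     AS dt ON dt.date_key    = f.date_key
        WHERE    dt.calendar_date BETWEEN @StartDate AND @EndDate
          AND   (@StateCode IS NULL OR d.state_code = @StateCode)
        GROUP BY d.state_code
        ORDER BY total_amount DESC;
    END;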

Environment: Erwin v9.7, Agile, NoSQL, Big Data, Azure, Redshift, MDM, Oracle 12c, SQL, Teradata R15, XML, Python 3.6, PL/SQL, Tableau, SSRS.

Sr. Data Analyst/ Modeler

Confidential, Dorchester, MA

Responsibilities:

  • Used SQL extensively to query Databases
  • Performed sales data analysis and maintained pricing tables that feed the live website
  • Conducted data assessments of legacy data
  • Used Excel, Access and VBA in the development of robust and flexible reporting systems
  • Retrieved data from data warehouse and generated a series of meaningful business reports using SSRS and Cognos.
  • Deployed and maintained the production environment using AWS EC2 instances and Elastic Container Service with Docker
  • Monitored resources and applications using AWS CloudWatch, including creating alarms for metrics such as EBS, EC2, ELB, RDS, S3, and SNS, and configured notifications for the alarms generated based on defined events
  • Performed troubleshooting and monitoring of Linux servers on AWS.
  • Maintained user accounts (IAM), RDS, Route 53, SES, and SNS services in the AWS cloud
  • Validated and tested reports, then published the reports to the report server.
  • Performed and conducted complex custom analytics as needed by clients.
  • Designed and developed specific databases for collection, tracking and reporting of data.
  • Established, maintained and distributed daily, weekly and monthly reports
  • Created financial models using Excel Pivot Tables and formulas to develop best- and worst-case scenarios
  • Created user accounts, policies, and roles on Amazon Web Services (AWS) using IAM
  • Performed data auditing, created data reports, and monitored all data for accuracy
  • Performed data cleansing, checking for data redundancy and duplicates, and reported on all findings (a sample duplicate check follows this list)
  • Assisted in the development of Metrics to measure the business data cleansing progress
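
A short SQL sketch of the duplicate checks used during the data cleansing work above; the staging table and columns are hypothetical:

    -- Flag natural keys that occur more than once in the staging data
    SELECT  customer_id,
            COUNT(*) AS occurrences
    FROM    stg_customer
    GROUP BY customer_id
    HAVING  COUNT(*) > 1;

    -- Keep only the most recently loaded copy of each duplicate
    WITH ranked AS (
        SELECT  *,
                ROW_NUMBER() OVER (PARTITION BY customer_id ORDER BY load_date DESC) AS rn
        FROM    stg_customer
    )
    SELECT * FROM ranked WHERE rn = 1;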

Environment: Erwin 9.0, AWS, Redshift, Oracle 11g, SQL Server 2010, Teradata 14, XML, OLTP, PL/SQL, Linux, UNIX, MLOAD, BTEQ, UNIX shell scripting

Data Analyst

Confidential

Responsibilities:

  • Played an active and lead role in gathering, analyzing and documenting the Business Requirements, the business rules and technical requirements from the Business Group and the Technology Group.
  • Drafted enterprise security standards and guidelines
  • Performed and created procedures for system security audits
  • Assisted in the development of database access controls and separation of duties
  • Assisted in incident response and disaster recovery
  • Responsible for the documentation and execution of remediation plans
  • Performed periodic security review of privileged accounts
  • Performed a thorough analysis of the Sales Order Entry process to eliminate bottlenecks
  • Captured As-Is Process, designed TO-BE Process and performed Gap Analysis
  • Gathered Requirements from different sources like Stakeholders, Existing Systems, and Subject Matter Experts by conducting Workshops, Interviews, Use Cases, Prototypes
  • Created Functional Requirement Specification documents, including UML Use Case diagrams, scenarios, activity and workflow diagrams, and data mapping; performed process and data modeling with MS Visio.
  • Worked with the DW, ETL teams to create Order entry systems Business Objects reports.
  • Worked in a cross functional team of Business Analysts, Architects and Developers to implement new features in various custom applications.
  • Developed and maintained various User Manuals and Application Documentation Manuals in SharePoint.
  • Created Test Plans and Test Strategies to define the Objective and Approach of testing.
  • Used Quality Center to track and report system defects and bug fixes, wrote modification requests for the bugs in the application and helped developers to track and resolve the problems.
  • Developed and Executed Manual, Automated Functional, GUI, Regression, UAT Test cases using QTP.
  • Gathered, documented and executed Requirements-based, Business process (workflow/user scenario), Data driven test cases for User Acceptance Testing.
  • Created Test Matrix, Used Quality Center for Test Management, track & report system defects and bug fixes.
  • Performed Load, stress testing and analyzed performance/response Times.
  • Created SQL queries with several parameters for back-end database testing (a sample parameterized test query follows this list).
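
An illustrative parameterized query of the kind used for back-end database testing in this role; the table names and parameter value are assumptions:

    -- Reconcile row counts between a source table and its warehouse target
    -- for a load date supplied by the test case
    DECLARE @LoadDate DATE = '2012-06-30';   -- hypothetical parameter value

    SELECT (SELECT COUNT(*) FROM src_orders     WHERE order_date = @LoadDate) AS source_rows,
           (SELECT COUNT(*) FROM dw_fact_orders WHERE order_date = @LoadDate) AS target_rows;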

Environment: Erwin 8.0, MS Excel, MS Access, UNIX, T-SQL, MS SQL Server 2008, SSIS, SSRS, MS Visio.
