Data Analyst Resume

Roseville, CA

SUMMARY

  • 8 years of progressive IT experience in Business Requirement Analysis, Data Analysis, Data Warehouse and Database Testing, ETL Development, and Data Modeling.
  • Good experience in evaluating business systems for user needs, business modeling and document processing.
  • Strong background in designing Logical and Physical Data Models using Erwin, MS Visio, ER Studio, Power Designer, and Toad Data Modeler.
  • Extensive work experience on Online Transaction Processing (OLTP) and Online Analytical Processing (OLAP) system environment.
  • Expert knowledge of the SDLC (Software Development Life Cycle); involved in all phases of the project process and familiar with Agile, Scrum, and Waterfall methodologies.
  • Solid experience in Relational, Dimensional, Conceptual, Logical, and Physical Modeling, Data Warehousing, Fact Tables, Dimension Tables, and Star and Snowflake Schemas per enterprise standards.
  • Experience in developing PL/SQL Scripts, stored procedures, Triggers, Views, and Indexes.
  • Experience in Extract, Transform and Load (ETL) of data from spreadsheets, database tables and other sources using SQL Server Integration Services (SSIS) and SQL Server Reporting Services (SSRS) for managers and executives.
  • Analyzed and formatted data using machine learning algorithms with Python scikit-learn.
  • Expertise in all facets of Business Intelligence applications with a strong background in Data extraction, data visualization, report generation, infographics and information visualization.
  • Strong understanding and usage of various types of dimensions such as Junk Dimensions, Slowly Changing Dimensions, Role-Playing Dimensions, and Degenerate Dimensions.
  • Experienced in working with RDBMS like Oracle 10g/ 9i/ 8i, Microsoft SQL Server 2008 R2 and Teradata.
  • Developed data-driven business and analytic strategies and visualizations for Fortune 100 companies, start-ups, and NGOs, specializing in Tableau dashboard development and turning big data into actionable insights.
  • Actively participated in all phases of the project life cycle including data acquisition, data cleaning and pre-processing, feature engineering, exploratory data analysis, model building, testing and validation, data visualization, and final presentation to the client.
  • Proficient in designing of Star and Snowflake schemas with a very good understanding of fact and dimensional tables.
  • Very good knowledge of working with different types of data sources such as flat files, Excel, Oracle, Sybase, and SQL Server.
  • Working closely with ETL/ Report Developers, Database administrators, Business Analysts and support teams.
  • Built models using statistical techniques like Bayesian HMM and machine learning classification models like XGBoost, SVM, and Random Forest (a minimal sketch follows this list).
  • Well versed in Conceptual, Logical/Physical, Relational and Multi-dimensional modeling, Data analysis for Decision Support Systems (DSS), Data Transformation and Reporting.
  • Exposure to both the Ralph Kimball and Bill Inmon data warehousing approaches.
  • Experience in building reports using SQL Server Reporting Services and Crystal Reports.
  • Experience in performing Reverse Engineering of Physical Data Models from databases and SQL scripts.
  • Experience in using SSIS in solving complex business problems.
  • Expertise in SQL Server Analysis Services (SSAS) and SQL Server Reporting Services (SSRS) tools
  • Strong working Experience with Agile, Scrum, Kanban and Waterfall methodologies.
  • Capture, validate and publish metadata in accordance with enterprise data governance policies and MDM taxonomies.
  • Extensive practical knowledge on business process flow related to Conventional Insured, Conventional Uninsured (Fixed, ARM and FHA), VA, NACA, Sub-Prime Lending, HELOC Piggyback, HELOC Refinance, HELOANS, LIBOR and State Specific Mortgage Home Loans etc.
  • Extensive experience in development of T-SQL, Oracle PL/SQL Scripts, Stored Procedures and Triggers for business logic implementation.
  • Work experience in Informatica EDC 10.2, Axon 5.4/6.0 & Azure cloud infrastructure & IICS.
  • Hands on experience in analyzing data using Hadoop Ecosystem including HDFS, Hive, Streaming, Elastic Search, Kibana, Kafka, HBase, Zookeeper, PIG, Sqoop, and Flume.
  • Good Knowledge on SQL queries and creating database objects like stored procedures, triggers, packages and functions using SQL and PL/SQL.
  • Experience in testing Business Intelligence reports generated by various BI Tools like Cognos and Business Objects.
  • Involved in conducting training for users on interactions, filters, sorting, and customization of views on existing visualizations generated through Tableau Desktop.
  • Experience in generating DDL (Data Definition Language) Scripts and creating Indexing strategies.
  • Impressive presentation and communication skills, with very good leadership qualities.
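
As referenced in the list above, classification models such as Random Forest and SVM were built with scikit-learn. The sketch below is a minimal, hypothetical illustration of that kind of workflow; the bundled breast-cancer dataset, the model choices, and the hyperparameters are assumptions for demonstration only, not the actual project data or configuration.

```python
# Minimal sketch (assumed setup): train and compare two scikit-learn classifiers.
# The dataset and hyperparameters are placeholders, not the real project data.
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42
)

models = {
    "random_forest": RandomForestClassifier(n_estimators=200, random_state=42),
    "svm": SVC(kernel="rbf", C=1.0),
}
for name, model in models.items():
    model.fit(X_train, y_train)
    print(name, round(accuracy_score(y_test, model.predict(X_test)), 3))
```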

TECHNICAL SKILLS

Programming Languages: C, C++, SQL, Python, R.

Scripting Languages: MS-DOS, Bash, Korn.

Reporting Tools: SSRS, Power BI, Tableau, SSAS, MS-Excel, SAS BI Platform.

Tools: Microsoft Azure, Alteryx, JMP, SAS, AnyLogic, AutoCAD, SolidWorks, Ansys, Adobe Experience Manager.

OLAP Tools: Tableau, SAP BO, SSAS, Business Objects, and Crystal Reports

Data Modeling: Erwin Data Modeler, Erwin Model Manager, ER Studio v17, and Power Designer.

Application/Web Servers: JBoss, GlassFish 2.1, WebLogic, WebSphere, Apache Tomcat Server.

MS-Office Package: Microsoft Office (Windows, Word, Excel, PowerPoint, Visio, Project).

Platform/Tools: Tableau, Power BI, Salesforce, QlikView, SSIS, SSRS, Advanced Microsoft Excel, ETL, RStudio, Outlook, Word, MS Access, VMware, Jira, Jupyter Notebook, Anaconda, IBM Mainframes, LPS

ETL Tools / Tracking tool: Informatica, SSIS, SSAS, SSRS / JIRA.

Databases: Teradata R12/R13/R14.10, MS SQL Server, DB2, Netezza.

Testing and Defect Tracking Tools: HP/Mercury (Quality Center, WinRunner, QuickTest Professional, Performance Center, Requisite), MS Visio & Visual SourceSafe.

PROFESSIONAL EXPERIENCE

Data Analyst

Confidential, Roseville, CA

Responsibilities:

  • Used Informatica PowerCenter for extraction, transformation and loading (ETL) of data from heterogeneous source systems into the target database.
  • As a Sr. Data Modeler/Data Analyst, responsible for all data-related aspects of the project.
  • Developed full life cycle software including defining requirements, prototyping, designing, coding, testing and maintaining software.
  • Creating UI Mockups for the Data Visualizations and presenting to the Stakeholders.
  • Worked on Software Development Life Cycle (SDLC) with good working knowledge of testing, agile methodology, disciplines, tasks, resources and scheduling.
  • Developed the long-term data warehouse roadmap and architecture, and designed and built the data warehouse framework per the roadmap.
  • Worked on high-priority enhancement change requests involving various Fannie Mae and Freddie Mac Special Lending Options (SLO) programs for Conforming Fixed, ARM, Interest Only, and LIBOR loans and State Specific Programs.
  • Perform slicing and dicing of data using SQL and Excel for data cleaning and data preparation
  • Developed logical data models and physical database design and generated database schemas using Erwin.
  • A highly immersive Data Science program involving Data Manipulation & Visualization, Web Scraping, Machine Learning, Python programming, SQL, GIT, Unix Commands, NoSQL, MongoDB, Hadoop.
  • Designed and developed various dashboards, reports utilizing advanced Tableau Visualizations like Waterfall Charts, Funnel Charts, Pareto Charts, Dual Axis, Blended Axes, Scatter Plots, Bar in Bar, Box Plot, Bullet Chart, Gantt Chart, Heat Map, Confidential Line, KPI Charts.
  • Transitioned and currently focused on the data governance vertical within Informatica and completed enablement on Enterprise Data Catalog (EDC) and Axon.
  • Perform data collection, data cleaning, data wrangling, data analysis and building machine learning model on the data sets in both R and Python
  • Designed and developed architecture for data services ecosystem spanning Relational, NoSQL, and Big Data technologies.
  • Designed both 3NF data models for ODS, OLTP systems and dimensional data models using star and snow flake Schemas.
  • Conducted JAD sessions with management, vendors, users and other stakeholders for open and pending issues to develop specifications.
  • Developed Data Mapping, Data Governance, and Transformation and Cleansing rules for the Master Data Management Architecture.
  • Extensive experience building a Master Data Management (MDM) strategy in an organization.
  • Document data dictionaries and business requirements for key workflows and process points
  • Involved in Dimensional modeling (Star Schema) of the Data warehouse and used Power Designer to design the business process, dimensions and measured facts.
  • Designed ER diagrams and mapping the data into database objects.
  • Involved in Data profiling in order to detect and correct inaccurate data and maintain the data quality.
  • As a Data Visualization Consultant supporting Risk Consulting Team in Model Documentation and Data Visualizations for the Data Science Team which develops Credit and Market Risk Predictive Models for the US Banking clients.
  • Used pandas, numpy, seaborn, scipy, matplotlib, scikit-learn, NLTK in Python for developing various machine learning algorithms.
  • Worked on different data formats such as JSON, XML and performed machine learning algorithms in Python.
  • Excellent experience in migration of trial data using EDC systems.
  • Clinical Trial implementation using EDC applications like RAVE and Inform.
  • Conducted EDC (RAVE) training for end users such as Johnson & Johnson, PRA, AstraZeneca, Celgene, and Baxter.
  • Worked with Tableau, IDQ, PWC, MDM, EDC, Entity 360, Axon, and Collibra.
  • Researched and developed hosting solutions using MS Azure for the service solution.
  • Developed Data Migration and Cleansing rules for the Integration Architecture (OLAP, ODS, DW)
  • Implemented Forward Engineering by using DDL scripts and generating indexing strategies.
  • Reverse Engineered physical data models from SQL Scripts and databases.
  • Worked with Data Analytics, Data Reporting, Ad-hoc Reporting, Graphs, Scales, PivotTables and OLAP reporting.
  • Wrote and executed SQL queries to verify that data had been moved from the transactional system to the DSS, data warehouse, and data mart reporting systems in accordance with requirements (see the reconciliation sketch after this list).
  • Generated various reports using SQL Server Report Services (SSRS) for business analysts and the management team.
  • Analyzed results from data validation queries to present to user groups.
  • Involved in data lineage and Informatica ETL source to target mapping development, complying with data quality and governance standards.
  • Designed, developed data integration programs in a Hadoop environment with NoSQL data store Cassandra for data access and analysis.
  • Used Pig to extract and write complex data transformations, clean and process large data sets, and store data in HDFS.
  • Prepared and tested data for international metrics including wholesale credit, investments, oil, gas, LIBOR, credit cards, and commercial information.
  • Wrote and executed unit, system, integration and UAT scripts in Data Warehouse projects.
  • Assisted in defining business requirements and created BRD (Business Requirements Document) and functional specifications documents.
  • Supported development team & QA team during process design and during performance tuning, Test Strategy and test case development.
  • Responsible for the data architecture design delivery, data model development, review, approval and Data warehouse implementation.
  • Created/Executed Rules in IDQ Analyst/Developer and projected results in Axon Quality Dashboard
  • Devised the target-state architecture for the LIBOR to SOFR transition.
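
The source-to-target validation queries mentioned in the list above can be reconciled programmatically. The sketch below is a minimal, assumed illustration of that reconciliation pattern in Python, using in-memory SQLite; the table names, columns, and sample rows are placeholders, not the actual warehouse objects.

```python
# Minimal sketch (assumed schema): reconcile a source table against its warehouse target.
import sqlite3

source = sqlite3.connect(":memory:")  # stands in for the transactional source
target = sqlite3.connect(":memory:")  # stands in for the data warehouse / data mart

source.executescript("""
    CREATE TABLE orders (order_id INTEGER, amount REAL);
    INSERT INTO orders VALUES (1, 100.0), (2, 250.5), (3, 75.25);
""")
target.executescript("""
    CREATE TABLE fact_orders (order_key INTEGER, amount REAL);
    INSERT INTO fact_orders VALUES (1, 100.0), (2, 250.5), (3, 75.25);
""")

def profile(conn, table, amount_col):
    # Two simple reconciliation measures: row count and total amount.
    cur = conn.execute(f"SELECT COUNT(*), SUM({amount_col}) FROM {table}")
    return cur.fetchone()

src_stats = profile(source, "orders", "amount")
tgt_stats = profile(target, "fact_orders", "amount")
print("source:", src_stats, "target:", tgt_stats, "match:", src_stats == tgt_stats)
```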

Environment: Erwin 9.7, Agile, MDM, PL/SQL, SSAS, SSRS, ETL, OLTP, OLAP, NoSQL, MS Visio 2016, MS Azure, Pig.

Data Analyst

Confidential, NY

Responsibilities:

  • Collaborated with various business stakeholders to create Business Requirement Document (BRD), translated gathered high-level requirements into a Functional Requirement Document (FRD) to assist implementation side SMEs and developers, along with data flow diagrams, user stories and use cases.
  • Designed and developed Use Cases, Activity Diagrams, Sequence Diagrams, OOD (Object oriented Design) using UML and Visio.
  • Assessed the pros and cons of migration and conveyed the same to the Business Analyst.
  • Used forward engineering approach for designing and creating databases for OLAP model.
  • Worked as a Tableau Desktop Developer focusing on developing high-end visualizations driven by data coming in from various data sources including Flat files, SQL Server and MS Excel.
  • Participated in all phases of Data mining, Data cleaning, Data collection, developing models, Validation, Visualization and Performed Gap analysis.
  • Conducted design walk through sessions with Business Intelligence team to ensure that reporting requirements are met for the business.
  • Worked on logical and physical model using Erwin based on requirements.
  • Designed Star and Snowflake Data Models for Enterprise Data Warehouse using ERWIN.
  • Worked with DBA to create the physical model and database objects.
  • Normalized the database tables into 3NF for the data warehouse.
  • Assisted DBAs in the implementation of the data models.
  • Worked closely with ETL team in loading and mapping the data.
  • Created Source-to-target (S2T) mapping document as part of Data Analysis.
  • Built Tabular model cubes using SSAS involving dimension and fact tables.
  • Involved in day-to-day maintenance and solved any issues related to reports.
  • Used advanced data visualization and representation techniques in Tableau to provide an easy to understand interface for end users to quickly identify key areas within their data.
  • Created and maintained Database Objects (Tables, Views, Indexes, Partitions, Database triggers)
  • Dealt with different data sources ranging from flat files, Excel, Oracle, and SQL Server.
  • Developed Tableau visualizations and dashboards using Tableau Desktop.
  • Experience in Project development and coordination with onshore-offshore ETL/BI developers & Business Analysts.
  • Worked with machine learning algorithms like regressions (linear, logistic, etc.), SVMs, and decision trees for classification of groups, analyzed the most significant variables such as FTE, waiting times of purchase orders, and available capacities, and applied process improvement techniques.
  • Generated ad-hoc SQL queries using joins, database connections, and transformation rules to profile data from Oracle and SQL Server database systems (a profiling sketch follows this list).
  • Strong Experience in conducting User Acceptance Testing (UAT), Unit Testing and documenting Test Cases and Test Scripts.
  • Communicated with the project team throughout all stages of design, managed time effectively, and worked on multiple project timelines simultaneously in a demanding, deadline-driven environment.
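
The ad-hoc profiling described in the list above (joins plus transformation rules) can also be prototyped in pandas. The sketch below is a minimal, hypothetical example of profiling a left join for unmatched keys and null rates; the data frames and column names are made-up placeholders rather than the Oracle/SQL Server sources used on the project.

```python
# Minimal sketch (assumed data): profile a left join for unmatched keys and null rates.
import pandas as pd

# Hypothetical extracts standing in for the Oracle / SQL Server sources.
orders = pd.DataFrame({"customer_id": [1, 2, 2, 4], "amount": [120.0, 80.5, None, 42.0]})
customers = pd.DataFrame({"customer_id": [1, 2, 3], "region": ["East", "West", "South"]})

# Left join with an indicator column to flag orders that have no matching customer.
joined = orders.merge(customers, on="customer_id", how="left", indicator=True)

print(joined["_merge"].value_counts())  # rows per join outcome (both / left_only)
print(joined.isna().mean())             # null rate per column
```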

Environment: CA Erwin 7, Oracle 10g, UNIX, MS Excel 2007, SQL Server 8, Tableau, HP Quality Center 10.

Data Analyst

Confidential

Responsibilities:

  • Built a website for Strategic Pricing Department using Adobe Experience Manager
  • Helped the organization save $748k per year by building an enhanced website.
  • Performed data visualization using charts in think-cell/Excel/Tableau (column, pie, stacked, waterfall, bubble, scatter, line, Pareto, etc.)
  • Successfully implemented Customer Driven Design Management Technique
  • Designed a website to improve communication across and within multiple departments
  • Performed comprehensive analysis of the survey results
  • Worked on clustering and factor analysis for classification of data using machine learning algorithms (see the clustering sketch after this list).
  • Performed decision tree analysis and random forests for strategic planning and forecasting, and manipulated and cleaned data using the dplyr and tidyr packages in R.
  • Designed an ISO template to better organize information inside the department
  • Demonstrated the website in front of SVP, VP, Directors and Managers
  • Learned complex operations of the pricing department and contract management department
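
The clustering work mentioned in the list above is illustrated below with a minimal, assumed scikit-learn KMeans example on synthetic data; the feature set, scaling choice, and number of clusters are placeholders, not the actual pricing data or model.

```python
# Minimal sketch (assumed setup): cluster scaled features with KMeans.
from sklearn.cluster import KMeans
from sklearn.datasets import make_blobs
from sklearn.preprocessing import StandardScaler

# Synthetic stand-in for the real feature matrix.
X, _ = make_blobs(n_samples=300, centers=4, random_state=0)
X_scaled = StandardScaler().fit_transform(X)

# Assumed number of clusters; on real data this would be chosen via elbow or silhouette analysis.
kmeans = KMeans(n_clusters=4, n_init=10, random_state=0).fit(X_scaled)
print("cluster sizes:", [int((kmeans.labels_ == k).sum()) for k in range(4)])
```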

Data Services/ Data Analytics

Confidential, Franklin, TN

Responsibilities:

  • Designed and demonstrated interactive Power BI dashboards for the end customers
  • Used real world data to build a dashboard for a pharmaceutical retail company
  • Ran complex queries in SQL Server Management Studio on end-user data
  • Participated in all phases of data mining, data collection, data cleaning, developing models, validation, and visualization and performed Gap analysis
  • Cleaned and sorted big data in SQL to be imported into Power BI dashboards
  • Used DAX queries and internal Power BI commands to make dashboards attractive
  • Utilized Alteryx to join two SQL tables and organize the Alteryx flow
  • Kept daily progress updated on team projects using Microsoft Azure
  • Contributed to web scraping and forecasting projects based on real-world data (a minimal scraping sketch follows this list).
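
The web-scraping contribution noted in the list above is sketched below as a minimal, hypothetical example using requests and BeautifulSoup; the URL and the table structure are placeholders, and a real target site would need its own parsing logic.

```python
# Minimal sketch (assumed target): fetch a page and collect table rows.
import requests
from bs4 import BeautifulSoup

url = "https://example.com/"  # placeholder, not the real data source
response = requests.get(url, timeout=10)
response.raise_for_status()

soup = BeautifulSoup(response.text, "html.parser")
rows = []
for row in soup.select("table tr"):
    cells = [cell.get_text(strip=True) for cell in row.find_all(["th", "td"])]
    if cells:
        rows.append(cells)

print(f"scraped {len(rows)} rows")  # the placeholder page has no table, so this prints 0
```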

Technical Sales Engineer

Confidential

Responsibilities:

  • Helped clients choose products and quoted the price
  • Created sales reports by collecting, analyzing, and summarizing sales information
  • Prepared Bill of Materials and Cost of Goods to be sent to production department
  • Performed successful test run of assembled CNC machine
  • Organized and delivered technical presentations explaining products or services to customers
