Data Analyst Resume

Irving, TX

SUMMARY

  • Resourceful, results-driven IT professional with 6 years of professional experience in Data Analytics, Business Intelligence, Statistical Modeling, Data Mining, Predictive Modeling, Data Visualization, and Data Architecture.
  • Extensive hands-on experience and high proficiency with structured, semi-structured, and unstructured data, using a broad range of data analytics techniques and big data tools including Python, SQL, R, Spark, and Hadoop MapReduce.
  • Performed data analysis and data profiling using SQL on various source systems, including Oracle and Teradata.
  • Excellent SQL programming skills; developed stored procedures, triggers, functions, and packages using SQL and PL/SQL.
  • Extensively used SQL for accessing and manipulating database systems; wrote SQL queries to update databases and extract data for analysis.
  • Performed data manipulation, data cleaning, and data transformation for loading and extraction with Python libraries such as NumPy, SciPy, Pandas, Matplotlib, and scikit-learn for data analysis and numerical computations (a minimal sketch follows this list).
  • Extensive data analysis, data reporting, and system analysis experience in data processing environments using ETL tools.
  • Experience in data warehousing, data architecture, and extraction, transformation, and loading (ETL) of data from various sources into data warehouses and data marts using Informatica PowerCenter 10.x/9.x and SQL Server Integration Services (SSIS).
  • Experience building high-performance data integration solutions using SSIS and Informatica, creating packages for data extraction, transformation, and loading (ETL), data validation, and loading data into data warehouses/data marts.
  • Familiar with performing data analysis and statistical analysis and generating reports, listings, and graphs using SAS tools: SAS/Base, SAS/Macros, SAS/Graph, SAS/SQL, SAS/Connect, and SAS/Access.
  • Implemented business intelligence dashboards using Tableau and Power BI to produce different summary results based on business requirements.
  • Extensive knowledge of various reporting objects such as facts, attributes, hierarchies, transformations, filters, prompts, calculated fields, sets, groups, and parameters.
  • Experienced in creating quick table calculations, static and dynamic sets, parameters, and Level of Detail (LOD) expressions using advanced logical and statistical functions in Tableau and Power BI.
  • Hands-on experience with MDX expressions, DAX expressions, Power Pivot, and Power BI integrated with SharePoint, and in creating dashboards in Power BI.
  • Experience in importing and exporting data from RDBMS to HDFS, Hive tables and HBase by using Sqoop.
  • Experience working with Agile and Waterfall methodologies on data modeling projects.
  • Excellent technical, communication, and analytical skills; strong interpersonal abilities and an effective team player.
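
A minimal sketch of the kind of Pandas-based cleaning and transformation described above; the file name, column names, and cleaning rules are hypothetical placeholders rather than actual project code:

    import pandas as pd
    import numpy as np

    # Load a raw extract (hypothetical file)
    df = pd.read_csv("raw_extract.csv")

    # Standardize column names and drop exact duplicate rows
    df.columns = df.columns.str.strip().str.lower().str.replace(" ", "_")
    df = df.drop_duplicates()

    # Coerce types and handle missing values
    df["order_date"] = pd.to_datetime(df["order_date"], errors="coerce")
    df["revenue"] = pd.to_numeric(df["revenue"], errors="coerce").fillna(0.0)

    # Derived feature for downstream numerical work
    df["log_revenue"] = np.log1p(df["revenue"])

    df.to_csv("clean_extract.csv", index=False)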

TECHNICAL SKILLS

Programming: SQL, Python, R

Databases: Oracle, SQL Server, MySQL, DB2, Teradata, PL/SQL

Business Intelligence Tools: Tableau, SQL Server Reporting Services (SSRS), Crystal Reports, Business Objects, MicroStrategy, Power BI

Statistical Models: Decision Trees, Naive Bayes Classification, Logistic Regression, Linear Regression, Neural Networks, Support Vector Machines, Clustering Algorithms and PCA

Big Data Technologies: Apache Hadoop, Hive, Spark, Data Warehouse (Snowflake)

Data Warehousing: SSIS, Informatica (PowerCenter/PowerMart), Data Mining, Data Mart, OLAP/OLTP, Data Profiler, IBM InfoSphere

PROFESSIONAL EXPERIENCE

Confidential, Irving, TX

Data Analyst

Responsibilities:

  • Working with the Marketing and Revenue Analytics team to identify business needs, understand strategic plans, and provide analytical solutions for offline and online campaigns.
  • Involved in creating a content-based recommendation system for different product lines using association rules and collaborative filtering for CVS retail products.
  • Developed large-scale data structures and pipelines to organize, collect, and standardize data using SQL and Python, helping generate insights and address reporting needs.
  • Performed data mapping and logical data modeling, created class diagrams and ER diagrams, and used SQL queries to filter the product data.
  • Performed feature engineering, trained algorithms, back-tested models, and compared model performance using Python packages such as Pandas, NumPy, and SciPy.
  • Developed SQL queries to fetch complex data from different tables in remote databases using joins and database links, formatted the results into reports, and kept logs.
  • Performed statistical analyses such as correlation analysis, ANOVA, t-tests, and market basket analysis to segment customer groups.
  • Built factor analysis and cluster analysis models using the Python SciPy package to classify customers into different target groups (see the sketch after this list).
  • Applied a complex marketing segmentation algorithm to select target customer prospects for mailings.
  • Gathered requirements and modeled the data warehouse and the underlying transactional database.
  • Designed and developed the logical and physical data models to support the Data Marts and the Data Warehouse.
  • Assessed existing data, identified data gaps, and collaborated with Data Strategy and business partners to develop approaches to closing the data gaps.
  • Worked with large data sources, including administrative claims and clinical program data, focusing on efficient data extraction from large databases and data manipulation, and set up optimized summary data layers.
  • Designed ETL processes to load prospects to data marts.
  • Worked on data profiling and developing various data quality rules using Informatica Data Quality.
  • Implemented exception-handling mappings using Informatica Data Quality and performed data validation using Informatica Analyst.
  • Maintained and enhanced existing SAS reporting procedures for churn reporting and marketing campaigns.
  • Scripted SAS programs using SAS/Base and SAS/Macros to transfer data between Excel files and convert values from character to numeric (and vice versa) for further analysis, creating global and local variables.
  • Implemented business intelligence dashboards using Tableau, producing different summary results based on requirements and role membership.
  • Created Tableau scorecards and dashboards using stacked bars, bar graphs, scatter plots, geographical maps, and Gantt charts.
  • Used Tableau to enhance business understanding through simple and concise data storytelling.
  • Leveraged advanced Tableau functionality (parameters, actions, tooltip modifications, LODs, APIs, etc.) to create analytical and complex interactive dashboards.
  • Developed different types of Tableau reports, working on the data model to fit Tableau's needs and exploring data in Tableau to best answer business requirements.
  • Created Tableau views with complex calculations and hierarchies making it possible to analyze and obtain insights into large data sets.
  • Worked closely with other analytical consultants, data scientists, data engineers, and visualization specialists to drive end-to-end data gathering, data automation, predictive modeling, code maintenance, visualization, and reporting.
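
A minimal sketch of the cluster-analysis step referenced above, using SciPy's k-means implementation; the input file, feature columns, and number of segments are hypothetical assumptions:

    import pandas as pd
    from scipy.cluster.vq import whiten, kmeans, vq

    customers = pd.read_csv("customer_features.csv")  # hypothetical input
    features = customers[["recency", "frequency", "monetary"]].to_numpy(dtype=float)

    # Normalize each feature to unit variance before clustering
    whitened = whiten(features)

    # Fit k-means with 4 target groups, then assign each customer a segment
    centroids, distortion = kmeans(whitened, 4)
    labels, _ = vq(whitened, centroids)
    customers["segment"] = labels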

Environment: MySQL, Teradata, Tableau, Python, SQL, SAS, Excel, SQL Workbench, BI, Informatica, ETL, Data Warehouse, AWS, S3, UAT

Confidential

Data Analyst

Responsibilities:

  • Collaborated with the business team on data initiatives focused on using data to optimize business KPIs such as revenue and circulation, alongside a team of data professionals specializing in Analytics & Insight, Data Engineering, and Data Science.
  • Worked with data scientists to develop predictive models, forecasts, and analyses that turn data into actionable solutions for Confidential insurance products.
  • Managed the conversion of client data from various record-keeping systems into Confidential Retirement's proprietary systems.
  • Participated in all phases of data mining: data collection, data cleaning, model development, validation, and visualization, and performed gap analysis using Python.
  • Used statistical packages such as NumPy, Pandas, and Matplotlib for analyzing datasets.
  • Performed data analysis and data validation by writing complex SQL queries using TOAD against the Oracle database.
  • Wrote SQL queries to pull large volumes of records from Facets and claims databases using stored procedures, and performed extraction, transformation, and loading (ETL) using SSIS.
  • Created Python programs to automate the creation of Excel reports from data read from Redshift databases (see the sketch after this list).
  • Worked with the ETL team to document the Transformation Rules for Data Migration from OLTP to Warehouse Environment for reporting purposes.
  • Performed tuning of stored procedures and SQL queries using SQL Profiler and the Index Tuning Wizard in SQL Server.
  • Extensively used SQL, T-SQL, and PL/SQL to write stored procedures, functions, packages, and triggers.
  • Designed and developed various SSIS (ETL) packages to extract and transform data, and was involved in scheduling SSIS packages.
  • Developed SQL scripts for creating tables, sequences, triggers, views, and materialized views.
  • Generated various kinds of reports, dashboards, and visualizations using Power BI based on reporting requirements.
  • Modified datasets using DAX queries, merge statements, refresh schedules, row-level security, and modeling within Power BI.
  • Implemented drill-downs, drill-throughs, and bookmarks in Power BI to make the dashboards effective and easy to understand.
  • To ensure data matched business requirements, designed and deployed drill-down and drop-down menu options and parameterized and linked reports using Power BI.
  • Supported applications using the Jira ticket management system (TMS).
  • Followed the Waterfall methodology through the phases of the software development life cycle.
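
A minimal sketch of the Redshift-to-Excel automation mentioned above; the connection details, table, and query are hypothetical placeholders (Redshift speaks the PostgreSQL protocol, so the standard psycopg2 driver is used, and writing .xlsx via pandas assumes the openpyxl engine is installed):

    import pandas as pd
    import psycopg2

    conn = psycopg2.connect(
        host="example-cluster.redshift.amazonaws.com",  # placeholder
        port=5439, dbname="analytics", user="report_user", password="***",
    )

    # Hypothetical query for the last 30 days of claims
    query = """
        SELECT claim_id, member_id, paid_amount, service_date
        FROM claims
        WHERE service_date >= DATEADD(day, -30, CURRENT_DATE)
    """

    # Pull the result set into a DataFrame and write it to an Excel report
    df = pd.read_sql(query, conn)
    df.to_excel("monthly_claims_report.xlsx", index=False)
    conn.close()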

Environment: Python, SQL, Power BI, Excel, Oracle/ DB2, Teradata, Jira, BTEQ, TOAD, PL/SQL, Teradata SQL Assistant, UAT

Confidential, Plano, TX

Data Analyst

Responsibilities:

  • Worked with teams to identify and construct relationships among various parameters in analyzing customer response data for credit card and loan transactions.
  • Participated in all phases of data mining: data collection, data cleaning, model development, validation, and visualization, and performed gap analysis.
  • Performed hands-on data exploration and modeling work on massive loan and credit card data sets.
  • Used Python scripts to update content in the database and manipulate files.
  • Extensively used Python libraries such as NumPy, Pandas, and SciPy for data wrangling and analysis, and Python visualization libraries such as Seaborn and Matplotlib for plotting graphs.
  • Used inner and outer joins when creating SQL queries across multiple tables of transaction data.
  • Created complex SQL queries and scripts to extract and aggregate data and validate its accuracy (see the sketch after this list).
  • Delivered enterprise data governance, data quality, metadata, and ETL solutions using Informatica.
  • Used Informatica PowerCenter for extraction, transformation, and loading (ETL) of data from heterogeneous source systems into the target database.
  • Created mappings using Designer, extracted data from various sources, and transformed data according to requirements.
  • Developed Informatica mappings and reusable transformations to facilitate timely loading of data into a star schema.
  • Used Tableau to conduct analysis with features such as parameters and calculated fields, and built visualization reports based on the analysis.
  • Designed and developed operational BI dashboards, and metrics analyses/queries using Tableau.
  • Reviewed and optimized SQL queries and edited inner, left, and right joins in Tableau by connecting to live/dynamic and static datasets.
  • Designed and created innovative and valuable visualizations in Tableau to track KPIs, metrics, and other key data points.
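
A minimal sketch of the join-and-aggregate validation queries described above, run from Python; the table and column names are hypothetical, and sqlite3 stands in for the project's Oracle/DB2 connection:

    import sqlite3
    import pandas as pd

    conn = sqlite3.connect("transactions.db")  # placeholder database

    # Aggregate transactions per customer across joined tables
    query = """
        SELECT c.customer_id,
               COUNT(t.txn_id) AS txn_count,
               SUM(t.amount)   AS total_amount
        FROM customers AS c
        LEFT OUTER JOIN transactions AS t
            ON t.customer_id = c.customer_id
        GROUP BY c.customer_id
    """

    # Compare these aggregates against the target system to validate accuracy
    totals = pd.read_sql(query, conn)
    print(totals.head())
    conn.close()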

Environment: Python, SQL, Tableau, Oracle/ DB2, PL/SQL, Teradata SQL Assistant, ETL, Informatica, Excel, Hive, Hadoop
