
Operational Data Analyst Resume

Laguna Hills, California

PROFESSIONAL SUMMARY:

  • Business Data Analyst with 3.5+ years of professional and technical experience, including a graduate assistantship
  • Substantial understanding of Big Data technologies, data analysis, and data visualization tools, including the Python scientific libraries
  • Experienced in all facets of the Software Development Life Cycle using Waterfall and Agile/Scrum methodologies
  • Proficient in working with databases and data sources such as Oracle, SQL Server, Netezza, XML, Excel sheets, and flat files
  • Adept at extracting data from multiple sources, manipulating and validating data, conducting Root Cause Analysis, developing business/operational solutions and recommendations, and presenting to senior leadership
  • Extensive experience developing ETL programs to support data extraction, transformation, and loading
  • Good understanding of Data Warehousing concepts like Star and Snowflake Schema, SCD Type1/Type2, Fact and Dimension
  • Efficient in Normalization/Denormalization techniques for optimum performance in OLTP and OLAP environments
  • Proficient in writing complex SQL queries, stored procedures, sub-queries, functions, and triggers, as well as database design and index creation
  • Solid organizational and project management skills, with the ability to work in a cross-functional environment
  • Demonstrated willingness, interest, and aptitude to learn new technologies
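The complex SQL work noted above (sub-queries, indexes, aggregation) can be illustrated with a minimal, self-contained sketch; the table, data, and query here are hypothetical, not from any actual engagement:

```python
import sqlite3

# In-memory database with a hypothetical orders table (illustrative only).
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (
        order_id INTEGER PRIMARY KEY,
        customer TEXT NOT NULL,
        amount   REAL NOT NULL
    );
    CREATE INDEX idx_orders_customer ON orders (customer);
    INSERT INTO orders (customer, amount) VALUES
        ('alice', 120.0), ('alice', 80.0), ('bob', 40.0), ('carol', 300.0);
""")

# Sub-query: customers whose total spend exceeds the average customer total.
rows = conn.execute("""
    SELECT customer, SUM(amount) AS total
    FROM orders
    GROUP BY customer
    HAVING SUM(amount) > (
        SELECT AVG(customer_total) FROM (
            SELECT SUM(amount) AS customer_total
            FROM orders
            GROUP BY customer
        )
    )
    ORDER BY total DESC;
""").fetchall()
print(rows)
```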

SKILLS:

Languages: SQL, MySQL, Teradata (familiar), Python, Java (familiar), GitHub (familiar)

Data Analysis: Data Cleansing, Data Modeling, Data Mining, Database development, MDM

Business Intelligence: ETL, SSIS, Informatica (familiar), Oracle

Reporting Tools: Tableau, MS Excel, JIRA, MS Suite, MS Visio, MS SharePoint, MS Access

Frameworks and Toolkits: Spring, MVC, NumPy, SciPy, Pandas, Seaborn, scikit-learn, Matplotlib, Plotly

WORK HISTORY:

Operational Data Analyst

Confidential, Laguna Hills, California

Environment: SQL Server Management Studio, Python, Tableau, Microsoft suite, Notepad++

Responsibilities:

  • Manipulated, cleansed, and processed data using Excel, Notepad++, and Python; responsible for loading, extracting, and validating client data
  • Worked on data analysis, data profiling, source-to-target mapping, and the data specification document for the conversion process
  • Ensured data integrity, validity, and accuracy through the appropriate use of de-duplication, loading, and exporting tools for bulk structured/unstructured data
  • Identified source systems, their connectivity, and related tables and fields, and ensured data suitability for mapping
  • Designed and implemented new ETL processes in SSIS and adapted existing ETL processes to accommodate changes in source systems and new business user requirements
  • Performed unit testing at various levels of the ETL and was actively involved in team code reviews
  • Fixed invalid mappings and troubleshot technical problems in the database
  • Generated ad-hoc reports and maintained KPIs to track pipeline stages, giving management visibility into business trends; built solution-driven dashboards in Tableau with the team
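The cleansing and validation steps described above can be sketched with Pandas; the column names, records, and rules here are invented for illustration, not the actual client schema:

```python
import pandas as pd

# Hypothetical client records (illustrative only).
raw = pd.DataFrame({
    "client_id": [101, 102, 102, 103, 104],
    "email":     ["a@x.com", "b@x.com", "b@x.com", None, "d@x.com"],
    "balance":   ["250", "90", "90", "40", "oops"],
})

# De-duplicate on the business key, keeping the first occurrence.
clean = raw.drop_duplicates(subset="client_id", keep="first").copy()

# Validate: coerce balance to numeric; invalid entries become NaN.
clean["balance"] = pd.to_numeric(clean["balance"], errors="coerce")

# Simple integrity report for stakeholders.
report = {
    "rows_in": len(raw),
    "rows_out": len(clean),
    "missing_email": int(clean["email"].isna().sum()),
    "invalid_balance": int(clean["balance"].isna().sum()),
}
print(report)
```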

Graduate Research Assistant

Confidential, Riverside, California

Environment: Python, Jupyter Notebook, Microsoft suite, ETL, Oracle, Tableau

Responsibilities:

  • Worked closely with a professor on analyzing large structured/unstructured data sets covering 13 airlines and 322 airports across the US, using Python libraries such as NumPy and Pandas
  • Collected historical and third-party data from multiple data sources
  • Performed data cleaning and ensured data quality, consistency, and integrity using Pandas and NumPy
  • Identified outliers using box-plots and K-means clustering with Pandas and NumPy
  • Participated in feature engineering, including feature crossing, feature normalization, and label encoding with scikit-learn preprocessing
  • Collaborated with the data science team on data visualization, using Seaborn and scikit-learn to select features such as weekend delay and origin-airport delay
  • Created ad-hoc reports and dashboards using Excel, MS Visio, and Tableau
  • Built predictive models, including a polynomial regression model, identifying key variables and predicting behavior using Python scientific libraries and machine learning techniques, with roughly 70% accuracy
  • Experienced in data analytics, data reporting, ad-hoc reporting, graphs, scales, PivotTables, and OLAP reporting
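A minimal sketch of the box-plot outlier rule mentioned above (Tukey's 1.5×IQR fences); the delay values below are invented for illustration, not the actual airline data:

```python
import pandas as pd

# Hypothetical arrival-delay sample in minutes (illustrative only).
delays = pd.Series([3, 5, 4, 6, 5, 7, 4, 95, 5, 6, 4, -2])

# Box-plot rule: flag points outside Q1 - 1.5*IQR .. Q3 + 1.5*IQR.
q1, q3 = delays.quantile([0.25, 0.75])
iqr = q3 - q1
lower, upper = q1 - 1.5 * iqr, q3 + 1.5 * iqr
outliers = delays[(delays < lower) | (delays > upper)]
print(outliers.tolist())
```

The same fences are what a Seaborn or Matplotlib box plot draws as whiskers, so flagged points line up with the dots plotted beyond them.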

Data Analyst

Confidential

Environment: Informatica, Excel, Tableau, MS Suite, MySQL, Data Analysis, Oracle

Responsibilities:

  • Extracted, manipulated, and summarized data records in a meaningful way that clearly communicates data quality to key stakeholders
  • Executed SQL Query from relational databases and performed data validation, analysis and modeling (Root Cause & Behavioral Analysis)
  • Responsible for logical data modeling, dimensional modeling, reporting and analysis, data quality, data cleansing, Master Data Management (MDM), and data marts
  • Defined and refined data quality checks and data monitoring supporting business rules and changes, as well as data quality measures, reports, and dashboards
  • Implemented processes and logic to extract, transform, and distribute data across one or more data stores
  • Developed ETL using Informatica in a Microsoft SQL Server environment and worked on all aspects of data warehouse reporting
  • Troubleshot and resolved data, system, and performance issues
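The extract-transform-load pattern described above can be sketched end to end in a few lines; the source rows, target schema, and validation rule are hypothetical stand-ins for the production pipeline:

```python
import sqlite3

# Hypothetical source records (illustrative only).
source_rows = [
    {"id": 1, "name": " Alice ", "amount": "100.5"},
    {"id": 2, "name": "BOB",     "amount": "not-a-number"},
    {"id": 3, "name": "Carol",   "amount": "42"},
]

def transform(row):
    """Normalize names and coerce amounts; return None to reject a row."""
    try:
        amount = float(row["amount"])
    except ValueError:
        return None  # in a real pipeline, route rejects to an error log
    return (row["id"], row["name"].strip().title(), amount)

# Load the surviving rows into a (here, in-memory) target store.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT, amount REAL)")
loaded = [t for t in (transform(r) for r in source_rows) if t is not None]
conn.executemany("INSERT INTO customers VALUES (?, ?, ?)", loaded)
conn.commit()
```

Separating the transform into its own function keeps validation rules testable and lets rejected rows be counted and reported, as the data quality bullets above require.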
