
Data Engineer Resume


San Diego, CA

SUMMARY

  • Over 12 years of IT experience as a Data Engineer. Proactive and passionate about technology, works well in fully collaborative teams, and continually keeps current with the latest tools.

TECHNICAL SKILLS

Databases: MSSQL, Azure SQL, Postgres, Snowflake, MySQL, Oracle, MongoDB

Cloud: Azure, AWS, Azure Data Factory, Databricks

Languages: Python, T-SQL, DAX

ETL and ELT: SSIS, Pentaho, Talend, Informatica, Alteryx

Reporting Tools: SSRS, Power BI, Tableau

PROFESSIONAL EXPERIENCE

Confidential - San Diego, CA

Data Engineer

Responsibilities:

  • Designed, developed, and implemented client-server applications and database administration systems using MSSQL 2000 through 2019.
  • Proficient in query optimization, identifying missing indexes, resolving index fragmentation, and SQL Server performance tuning scripting (see the missing-index sketch after this list).
  • Automated internal processing; regression modeling with proficiency in building linear regression, multivariable regression, and logistic regression models for predictive analysis in Python, including the use of dummy variables and multivariable adjustment (see the regression sketch after this list).
  • Built and maintained SQL scripts, indexes, and complex queries for data analysis and extraction.
  • Snowflake experience: SnowSQL, Snowpipe, integration with AWS and Azure, the query optimizer, warehouses, and administration (see the Snowflake sketch after this list).
  • Excellent experience creating indexes and indexed views that observe business rules, and writing effective functions and appropriate triggers to support efficient data manipulation and data consistency.
  • Well experienced in data extraction, transformation, and loading (ETL) using tools such as SQL Server Integration Services (SSIS) and Data Transformation Services (DTS).
  • Experience creating dynamic SSIS packages for incremental loads and data cleaning in a data warehouse.
  • Experience importing/exporting data between sources such as Oracle, Access, and Excel using SSIS/DTS.
  • Knowledge of managing Azure and Azure SQL.
  • Good experience in Python software development (libraries used: Beautiful Soup, NumPy, argparse, SciPy, Matplotlib, logging, python-twitter, pandas DataFrames, NetworkX, urllib2, and MySQL, MSSQL, and Snowflake drivers for database connectivity) and IDEs such as Sublime Text, Jupyter Notebook, and VS Code.
  • Experience with SQL Server Analysis Services (SSAS), including MDX and DAX.
  • Extracted, transformed, and loaded data across heterogeneous sources and destinations such as Access, Excel, CSV, Oracle, and flat files using connectors, tasks, and transformations provided by SSIS.
  • Involved in creating SQL Server Agent jobs, mail, and alerts, and scheduling SSIS packages.
  • High Availability and Disaster Recovery Planning
  • Azure and Azure Data Factory: pipeline CI/CD with GitHub and Azure DevOps integration; linked services and datasets, including parameterized linked services and datasets; schedule, tumbling window (with dependencies), and event triggers; integration runtimes; connectors and supported file formats; monitoring and logging; variables, global parameters, and JSON templates; data flow transformations such as Union, Sort, New Branch, Select, Pivot, Unpivot, Surrogate Key, Alter Row, Flatten, Rank, Aggregate, Join, Conditional Split, Derived Column, Exists, Filter, Lookup, and Sink; and pipeline activities such as Copy, Wait, Web, Webhook, Stored Procedure, Until, Delete, Append Variable, ForEach, Get Metadata, If Condition, Switch, Validation, and Data Flow (see the pipeline-run sketch after this list).
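
A minimal sketch of the missing-index check mentioned in the list above, assuming a SQL Server connection through pyodbc; the server, database, and driver names are placeholders, and the impact score is a common ranking heuristic rather than anything specified in this resume.

    # Minimal sketch: surface missing-index suggestions from SQL Server's DMVs.
    # Server, database, and driver names below are placeholders.
    import pyodbc

    MISSING_INDEX_SQL = """
    SELECT TOP 10
        d.statement AS table_name,
        d.equality_columns,
        d.inequality_columns,
        d.included_columns,
        s.user_seeks,
        s.avg_total_user_cost * s.avg_user_impact * s.user_seeks AS impact_score
    FROM sys.dm_db_missing_index_details d
    JOIN sys.dm_db_missing_index_groups g ON d.index_handle = g.index_handle
    JOIN sys.dm_db_missing_index_group_stats s ON g.index_group_handle = s.group_handle
    ORDER BY impact_score DESC;
    """

    conn = pyodbc.connect(
        "DRIVER={ODBC Driver 17 for SQL Server};"
        "SERVER=myserver;DATABASE=mydb;Trusted_Connection=yes;"  # placeholder
    )
    for row in conn.cursor().execute(MISSING_INDEX_SQL):
        print(row.table_name, row.equality_columns, round(row.impact_score, 1))
    conn.close()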
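A minimal sketch of the regression modeling mentioned above, showing logistic regression with dummy-encoded categorical variables in Python; the file name and column names are hypothetical.

    # Minimal sketch: logistic regression with dummy variables.
    # The CSV file and column names are hypothetical.
    import pandas as pd
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import train_test_split

    df = pd.read_csv("customers.csv")  # hypothetical dataset
    X = pd.get_dummies(
        df[["region", "plan", "tenure_months"]],
        columns=["region", "plan"],  # dummy-encode the categorical columns
        drop_first=True,             # drop one level to avoid the dummy-variable trap
    )
    y = df["churned"]

    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
    model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
    print("holdout accuracy:", model.score(X_test, y_test))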
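A minimal sketch of Snowflake connectivity from Python using the official snowflake-connector-python package; the account, credentials, and object names are placeholders.

    # Minimal sketch: connect to Snowflake and run a query.
    # Account, credentials, and object names are placeholders.
    import snowflake.connector

    conn = snowflake.connector.connect(
        account="my_account",
        user="my_user",
        password="my_password",
        warehouse="COMPUTE_WH",
        database="ANALYTICS",
        schema="PUBLIC",
    )
    cur = conn.cursor()
    try:
        cur.execute("SELECT CURRENT_VERSION()")
        print("Snowflake version:", cur.fetchone()[0])
    finally:
        cur.close()
        conn.close()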
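A minimal sketch of starting an Azure Data Factory pipeline run from Python with the azure-identity and azure-mgmt-datafactory packages; the subscription, resource group, factory, pipeline, and parameter names are placeholders.

    # Minimal sketch: trigger an Azure Data Factory pipeline run.
    # Subscription, resource group, factory, pipeline, and parameter names
    # are placeholders.
    from azure.identity import DefaultAzureCredential
    from azure.mgmt.datafactory import DataFactoryManagementClient

    client = DataFactoryManagementClient(
        credential=DefaultAzureCredential(),
        subscription_id="<subscription-id>",  # placeholder
    )
    run = client.pipelines.create_run(
        resource_group_name="my-rg",               # placeholder
        factory_name="my-factory",                 # placeholder
        pipeline_name="CopySalesData",             # hypothetical pipeline
        parameters={"windowStart": "2023-01-01"},  # hypothetical parameter
    )
    print("pipeline run id:", run.run_id)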

Confidential - San Diego, CA

Data Engineer

Responsibilities:

  • Pentaho BI Server, Pentaho Data Integration (PDI), SQL*Loader, SQL Server 2000 through the latest version, and SQL Server Agent.
  • Worked with business users, the analytics team, and data architects to identify business requirements; developed design documents for ETL flows; analyzed database architecture; and created various complex Pentaho transformations and jobs using PDI Spoon (see the Kitchen sketch after this list).
  • Used the Pentaho Import Export utility to migrate Pentaho transformations and jobs from one environment to another.
  • Software professional with experience creating OLTP and OLAP data warehouses using the Pentaho Suite (Pentaho Data Integration).
  • Installed and configured the Pentaho Suite and tested transformations with it.
  • Prepared ETL (extract, transform, and load) standards and naming conventions, and wrote ETL flow documentation for the stage, ODS, and mart layers.
  • Able to handle multiple tasks with initiative and adaptability. Self-motivated, organized team player with strong problem-solving and analytical skills and total commitment to the organization's goals.
  • Good ability to quickly grasp and master new concepts and technologies
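
A minimal sketch of automating a PDI job from Python by shelling out to Kitchen, PDI's command-line job runner; the installation path, job file, and parameter are placeholders.

    # Minimal sketch: run a Pentaho Data Integration job via Kitchen.
    # The install path, job file, and parameter are placeholders.
    import subprocess

    result = subprocess.run(
        [
            "/opt/pentaho/data-integration/kitchen.sh",  # placeholder install path
            "-file=/etl/jobs/load_mart.kjb",             # hypothetical job file
            "-level=Basic",                              # log verbosity
            "-param:RUN_DATE=2023-01-01",                # hypothetical parameter
        ],
        capture_output=True,
        text=True,
    )
    print(result.stdout)
    result.check_returncode()  # non-zero exit status means the job failed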

Confidential - San Diego, CA

Data Engineer

Responsibilities:

  • Automated internal processing; regression modeling with proficiency in building linear regression, multivariable regression, and logistic regression models for predictive analysis in Python, including the use of dummy variables and multivariable adjustment.
  • Built and maintained SQL scripts, indexes, and complex queries for data analysis and extraction.
  • Actively participated in the design of the new Emac Engineering and Office 365 servers.
  • Negotiated equipment purchases and IT service contracts, always seeking the best prices with security and technology appropriate to the company's solution.
