
GCP Data Engineer Resume


SUMMARY

  • ITIC Engineer with more than 7 years of experience as a technology consultant designing IT solutions for sectors such as retail, banking, large warehouses, and government.
  • More than 3 years as a GCP Data Engineer for large cloud and on-premises platforms across the LATAM region.
  • Certified by Google as a Professional Data Engineer since April 2021.
  • More than 3 years working with cloud environments, primarily Google Cloud: BigQuery, Cloud SQL, Cloud Storage, Cloud SDK, Compute Engine, App Engine, Firebase, Cloud APIs, and other tools including Matillion, Dataflow, Dataproc, Dataprep, BQ-ML, Data Studio, Federated Queries, Data Fusion, Pub/Sub, Spanner, IAM Security, Data Transfer Service, VPC Configuration, Data Catalog, and Google-to-client VPN.
  • Strong knowledge of BI environments for banking and retail clients such as the World Bank and the Inter-American Development Bank in the United States, and banking clients such as Banco Banorte, Afirme, and Odessa in Mexico.
  • Strong knowledge of analytics tools such as Tableau, Power BI, SSRS, Cognos, Jasper Reports, and Report Portal.
  • Strong knowledge of presenting reports, paginated reports, dashboards, stories, KPIs, graphs, and analytical components.
  • Strong knowledge of the Microsoft BI suite (SSIS and SSAS), creating ETLs, cubes, and multidimensional and tabular models for on-premises and cloud targets.
  • My goal within companies is to collect client requirements exhaustively and efficiently, then design the best architecture for the project and implement a solution that gives the customer security, comfort, efficiency, and profitability.
  • Since 2018 I have been in charge of the BI department of a worldwide IT consulting company.
  • Experience with agile methodologies and tools such as Scrum; certified in Scrum Fundamentals.
  • Experience with other tool suites, such as IBM DataStage, QualityStage, and Cognos.
  • Strong experience modeling data with snowflake, star, and denormalized models.
  • Strong experience with SQL (Structured Query Language) on several RDBMSs, including MS SQL Server (all versions), Oracle 11g and 12c, DB2 10.5, 11.1, and cloud, MySQL, and PostgreSQL.
  • Strong experience with CTEs, window queries, pivot queries, views, materialized views, and temporal tables, as well as stored procedures, triggers, and other database objects.
  • Experience with Python and Spark, plus other languages such as Java and C#.
  • Experience with NiFi and Airflow to create data workflow pipelines between environments.
  • Experience with DevOps culture and its tools, such as Jenkins.
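The CTE and window-query experience above can be sketched with a small, hypothetical example; SQLite stands in for the RDBMSs listed, and the `sales` table and its columns are invented for illustration:

```python
# Illustrative only: a CTE combined with a window function, the query
# pattern listed above. SQLite (3.25+) is a stand-in for the RDBMSs
# mentioned; table and column names are hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE sales (region TEXT, amount INTEGER);
    INSERT INTO sales VALUES
        ('north', 100), ('north', 250), ('south', 300), ('south', 150);
""")

# The CTE aggregates per region; the window function ranks regions by total.
rows = conn.execute("""
    WITH region_totals AS (
        SELECT region, SUM(amount) AS total
        FROM sales
        GROUP BY region
    )
    SELECT region, total,
           RANK() OVER (ORDER BY total DESC) AS rnk
    FROM region_totals
    ORDER BY rnk
""").fetchall()
print(rows)  # -> [('south', 450, 1), ('north', 350, 2)]
```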

TECHNICAL SKILLS:

GCP: BigQuery, Cloud SQL, Cloud Storage, Cloud SDK, Compute Engine, App Engine, Firebase, Cloud APIs, plus Matillion, Dataflow, Dataproc, Dataprep, BQ-ML, Data Studio, Federated Queries, Data Fusion, Pub/Sub, Spanner, IAM Security, Data Transfer Service, VPC Configuration, Data Catalog, Google-to-client VPN

Microsoft BI suite: SSIS, SSAS, SSRS, Azure

IBM suite: DataStage, QualityStage, Cognos, Cloud Pak for Data (DataStage on cloud), Netezza, DB2 DWH

Programming Languages: Java (JDK), C#, Python, PHP, R; learning Scala

Application/Web Servers: Tomcat, IIS.

SQL: SQL, PL/SQL (DB2), PL/SQL (Oracle), Transact-SQL; CTEs, views, materialized views, stored procedures, window queries, triggers.

Methodologies: Agile, Scrum, Waterfall, DevOps.

Modeling Tools: UML, Visio

Testing Technologies/Tools: Jenkins

Database Servers: Oracle 10g/11g/12c, SQL Server 2000/2005/2008/2012/2014/2017, MySQL, PostgreSQL, Aurora, BigQuery, Cloud SQL, Spanner, DB2 10.5/11.1/12/cloud

Version Control: CVS, SVN, Git

Platforms: Windows 95/98/NT 4.0/2000/7/8/10, Windows Server 2003/2008/2012/2016/2019, UNIX (AIX), Linux (Debian, Red Hat, Fedora, Ubuntu), macOS

PROFESSIONAL EXPERIENCE

Confidential

GCP Data Engineer

Responsibilities:

  • Collect requirements.
  • Design, plan, and present the architecture for Confidential.
  • Set up the cloud environment for Confidential.
  • Follow the Agile methodology process.
  • Speech-to-text specialist for the Speech Suites solution.
  • Create the architecture for transactional data in Cloud SQL (MySQL).
  • Create the ETL with Airflow.
  • Create the architecture for DWH data in BigQuery.
  • Create models with the data to solve business problems.
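The Cloud SQL-to-BigQuery ETL described above can be sketched as the three steps an Airflow DAG would typically chain. This is a minimal illustration only: sqlite3 stands in for both the Cloud SQL source and the BigQuery warehouse, and the table names and business rule are invented.

```python
# Hypothetical sketch of the extract/transform/load steps that Airflow
# tasks might wrap for a Cloud SQL (MySQL) -> BigQuery pipeline.
# sqlite3 stands in for both databases so the sketch runs anywhere.
import sqlite3

source = sqlite3.connect(":memory:")     # stand-in for Cloud SQL
warehouse = sqlite3.connect(":memory:")  # stand-in for BigQuery

source.executescript("""
    CREATE TABLE orders (id INTEGER, amount_cents INTEGER);
    INSERT INTO orders VALUES (1, 1999), (2, 5000), (3, 250);
""")
warehouse.execute("CREATE TABLE fact_orders (id INTEGER, amount REAL)")

def extract():
    return source.execute("SELECT id, amount_cents FROM orders").fetchall()

def transform(rows):
    # Illustrative business rule: convert cents to currency units.
    return [(oid, cents / 100.0) for oid, cents in rows]

def load(rows):
    warehouse.executemany("INSERT INTO fact_orders VALUES (?, ?)", rows)

# In Airflow these would be chained tasks (extract >> transform >> load);
# here we simply call them in order.
load(transform(extract()))
total = warehouse.execute(
    "SELECT ROUND(SUM(amount), 2) FROM fact_orders").fetchone()[0]
print(total)  # -> 72.49
```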

Environment: GCP, BigQuery, Cloud SQL, Cloud Storage, Airflow, BQ-ML, Data Studio, MySQL, Git, Atlassian, Natural Language API, Windows Server, Python, shell scripts, IAM Security, Data Transfer Service, VPC Configuration, Data Catalog, Google-to-client VPN

Confidential

Sr. GCP Data Engineer and Lead

Responsibilities:

  • Collect requirements.
  • Design, plan, and present the architecture for Confidential.
  • Lead the design and implementation using the Agile methodology process.
  • Supervise the engineers during the delivery phases: design, development, and QA.
  • Big data processing for information analysis using BigQuery: identifying and improving processes, correcting errors, and analyzing user behavior. These data are key inputs for pre-planning new releases.
  • Created several models to prevent the loss of clients and sales.

Environment & Tools: GCP, BigQuery, Cloud SQL, Cloud Storage, Matillion, Dataprep, BQ-ML, Data Studio, Netezza, MySQL, Jenkins, Git, Atlassian, Red Hat, Windows Server, Python, shell scripts, Federated Queries, IAM Security, Data Transfer Service, VPC Configuration, Data Catalog, Google-to-client VPN.

Confidential

GCP Data Engineer

Responsibilities:

  • Collect requirements.
  • Design, plan, and present the architecture for Confidential.
  • Set up the environment on the cloud.
  • Follow the Agile methodology process.
  • Create the transactional database in Cloud SQL (MySQL).
  • Create the DWH database in BigQuery.
  • Create the ETL with Dataflow and Matillion to extract, transform, and load the data from Cloud SQL to BigQuery.
  • Create the business logic of the solution.
  • Create data models to support decision-making.
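The DWH work described above (a transactional source feeding a BigQuery warehouse) usually implies a dimensional model. A minimal, hypothetical star-schema sketch, with sqlite3 standing in for BigQuery and all table and column names invented:

```python
# Illustrative star-schema sketch of the DWH modeling described above,
# using sqlite3 in place of BigQuery; all names are hypothetical.
import sqlite3

dwh = sqlite3.connect(":memory:")
dwh.executescript("""
    CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE fact_sales (
        product_id INTEGER REFERENCES dim_product,
        qty INTEGER
    );
    INSERT INTO dim_product VALUES (1, 'wine'), (2, 'beer');
    INSERT INTO fact_sales VALUES (1, 3), (1, 2), (2, 7);
""")

# A typical analytical query: facts joined to a dimension and aggregated.
report = dwh.execute("""
    SELECT p.name, SUM(f.qty) AS units
    FROM fact_sales f JOIN dim_product p USING (product_id)
    GROUP BY p.name
    ORDER BY units DESC
""").fetchall()
print(report)  # -> [('beer', 7), ('wine', 5)]
```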

Environment: GCP, BigQuery, Cloud SQL, Cloud Storage, Matillion, BQ-ML, Data Studio, MySQL, Federated Queries, IAM Security, Data Transfer Service, Python, shell scripts, VPC Configuration, Data Catalog, Google-to-client VPN, Pub/Sub.

Confidential

GCP Data Engineer and MS-Consultant

Responsibilities:

  • Collect requirements.
  • Design, plan, and present the architecture for Confidential.
  • Set up the environment on the cloud.
  • Support and maintain the cloud objects.
  • Support and maintain the on-premises objects.
  • Follow the Agile methodology process.
  • Create the transactional database in Cloud SQL (MySQL) for several environments, such as QA and Development.
  • Create the DWH database in BigQuery for several environments, such as QA and Development.
  • Create the staging environment to cleanse and transform the data.
  • Create the ETL with Dataflow and Matillion to extract, transform, and load the data from Cloud SQL to BigQuery.
  • Create the business logic of the solution.
  • Create data models to support decision-making.
  • Create dashboards and reports with Data Studio, SSRS, and Power BI.

Environment: GCP, BigQuery, Cloud SQL, Cloud Storage, Matillion, BQ-ML, Data Studio, MySQL, MS SQL, Oracle, DB2, Maestro Financiero, Federated Queries, IAM Security, Data Transfer Service, Python, shell scripts, VPC Configuration, Data Catalog, Google-to-client VPN, Pub/Sub, SSIS, SSAS, SSRS, DataStage, QualityStage.

CONAFOR

GCP Data Engineer

Responsibilities:

  • Collect requirements.
  • Design, plan, and present the architecture for CONAFOR (fire, health, payments, and HR departments).
  • Set up the environment on the cloud and on-premises.
  • Support and maintain the cloud objects.
  • Support and maintain the on-premises objects.
  • Follow the Agile methodology process.
  • Manage the data.
  • Database administrator (MS SQL, Oracle, DB2).
  • Create cubes and tabular models with SSAS and DataStage.
  • Support and maintain databases.
  • Create the DWH database in BigQuery for several environments, such as QA and Development.
  • Manage and support Cloud SQL (SQL Server).
  • Create the ETL with SSIS to extract, transform, and load the data from Oracle to MS SQL.
  • Create the ETL with Dataflow and Matillion to extract, transform, and load the data from Cloud SQL to BigQuery.
  • Manage the cloud environment.
  • Create Cloud Storage buckets to serve as repositories.
  • Create the business logic of the solution.
  • Create data models to support decision-making.
  • Create dashboards and reports with Jasper Reports, Tableau, SSRS, and Power BI.

Environment & Tools: GCP, BigQuery, Cloud SQL, Cloud Storage, Matillion, Data Studio, MySQL, MS SQL, Oracle, DB2, Federated Queries, IAM Security, Data Transfer Service, Python, shell scripts, VPC Configuration, Data Catalog, Google-to-client VPN, Pub/Sub, SSIS, SSAS, SSRS, DataStage, QualityStage.

Confidential

BI CONSULTANT

Responsibilities:

  • Collect requirements.
  • Design, plan, and present the mobile and web architecture for Vinos America.
  • Develop the mobile platform with Windows CE.
  • Manage data in SQL CE.
  • Develop the web platform in ASP.
  • Manage data in SQL Server.
  • Create reports with SSRS.
  • Create reports with QlikView.
  • Create the logic for product entry and exit with handhelds.
  • Create the logic of Confidential in real time.
