Sr. Backend Data Engineer Resume

Redmond, WA

SUMMARY

  • Highly accomplished professional with a distinguished 10+ year career spanning multiple industries, including Supply Chain, Restaurant, Cosmetics, Banking, Entertainment, and Technology, specializing in Data Engineering, Data Warehousing, Business Intelligence, Reporting, and Data Analytics, with more than 3 years in a leadership role.
  • Hands-on experience supporting, stabilizing, and optimizing exabyte-scale data warehouse applications in complex production environments.
  • Experience working with business owners, stakeholders, and external vendors to solve complex application and integration related issues across multiple functional areas.
  • Ability to manage multiple assignments, including assessing risks, developing work plans, prioritizing, and scheduling work activities to support concurrent projects.
  • Proficient in the systems design and development process, including requirements analysis, impact analysis, pilot testing, installation, evaluation, and operational management.
  • Hands on experience in systems analysis, impact analysis and data integration involving multiple source systems.
  • Hands-on experience writing backend services using Confidential .NET Core, building unit/integration testing frameworks, and writing serverless code using Azure Functions.
  • Hands-on experience in leading cloud/big data technologies such as Confidential Azure and AWS: Cosmos®, Kusto/Application Insights Analytics, Scope®, Amazon Redshift, Airflow, Amazon EC2, Amazon S3, Oracle, and SQL Server; ETL tools such as Talend and SQL Server Integration Services; working knowledge of Informatica, DataStage, Hadoop, and Spark.
  • Hands-on experience ingesting heterogeneous source systems such as NoSQL endpoints (JSON, XML, etc.), RDBMS (SQL Server, Oracle, DB2, etc.), MPP (Azure SQL DW, Amazon Redshift, etc.), and cloud data platforms (Confidential Azure, Cosmos, Spark, HDFS, Hive, Pig, Azure SQL, Azure Storage/BLOB, Cassandra, S3, etc.).
  • Extensive experience in 24x7 production support, serving as a secondary escalation point for a multi-terabyte enterprise data warehouse.

TECHNICAL SKILLS

Languages/Scripting: C#, Java, Python, Bash & PowerShell

Backend: .NET Core 3.1

Frontend: Angular

Databases: Oracle 11g, SQL Server, Amazon Redshift, IBM Netezza

NoSQL: DynamoDB, CosmosDB, Cassandra

Compute: Azure Compute, Amazon EC2, SANSA-Stack and Spark

Analytics: SQL Server Analysis Services, Kusto, Python Pandas

Big Data: Azure, AWS, Hadoop, Hive, COSMOS

ETL: SSIS, Informatica, Talend, Azure Lens, Data Studio (CLOUD ES), SANGAM

CI/CD: Azure Pipelines, Jenkins

Scheduling Tools: Airflow, Hangfire, Apache Oozie, Amazon ADP, BMC Control-M

Reporting: Power BI, Tableau, Jupyter/ Zeppelin Notebook, Matplotlib/PyPlot, SSRS

ERP: SAP R/3 and Confidential Dynamics AX

Code Management: Git, TFS, VSS, Subversion

Operating Systems: macOS, Linux/UNIX, Windows

PROFESSIONAL EXPERIENCE

Confidential, Redmond, WA

Sr. Backend Data Engineer

Responsibilities:

  • Added Service Components for Signal data telemetry for the N-Layer Azure Global Cascade Application using .NET Core, SQL Server and Kusto.
  • Wrote sync services to capture various telemetry datasets, cleanse them, and publish them to Kusto tables.
  • Designed and maintained enterprise ETL data pipelines utilizing Hangfire® on .NET Core and Data Studio workflow orchestrators, with custom reducers written in Azure Data Explorer (Kusto); see the sketch after this list.
  • Implemented Azure functions for KPI Override data sync for M2, M3 and M4 KPIs and Service Layer sync for Service Tree API.
  • Designed and developed M2, M3 and M4 KPIs powering JEV and Ev2 extension adoption across Azure org for JEDI scoped services using Kusto and Power BI.
  • Implemented cross-cloud egress and import for various signals that power EV2 adoption progress for service teams, using air-gapped-to-public Kusto transfers.
  • Supported analytical needs of Cascade PM group using DAX calculations and a deep understanding of SSAS tabular model.
  • Migrated several backend entities from Data Studio to Hangfire, improving efficiency and SLA timelines for the Cascade PM group.
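
A minimal sketch of what one of these Hangfire-driven Kusto publish steps could look like; the cluster URL, database, table, and function names are hypothetical placeholders rather than the actual Cascade components:

```csharp
using Hangfire;
using Kusto.Data;
using Kusto.Data.Common;
using Kusto.Data.Net.Client;

public class TelemetrySyncJob
{
    // Called once at application startup (e.g., from Program.cs) to register
    // the recurring job with the Hangfire scheduler.
    public static void RegisterJobs()
    {
        // Every 15 minutes; the schedule here is illustrative.
        RecurringJob.AddOrUpdate<TelemetrySyncJob>(
            "signal-telemetry-sync",
            job => job.Execute(),
            "*/15 * * * *");
    }

    public void Execute()
    {
        // AAD application authentication; cluster and credentials are placeholders.
        var kcsb = new KustoConnectionStringBuilder("https://<cluster>.kusto.windows.net")
            .WithAadApplicationKeyAuthentication("<appId>", "<appKey>", "<tenantId>");

        using ICslAdminProvider admin = KustoClientFactory.CreateCslAdminProvider(kcsb);

        // One way to publish a cleansed batch: a set-or-append control command
        // that materializes the output of a staging function into the target table.
        admin.ExecuteControlCommand(
            "TelemetryDb",
            ".set-or-append SignalTelemetry <| CleanseSignalTelemetry()");
    }
}
```

Hangfire persists and retries these jobs, which is one reason to prefer it over ad hoc timers for ETL orchestration.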

Confidential, Seattle, WA

Sr. Data Engineer

Responsibilities:

  • In a Senior Data Engineer capacity in Confidential's Analytics Data Engineering group, supported the analytical needs of the Re-Ops and Analytics Data Engineering teams in the AWS ecosystem (AWS EC2, Redshift, DynamoDB, Lambda, etc.) and Airflow.
  • Established close collaboration with Confidential's data scientists, SMEs, product managers, and DevOps engineers to create long-lasting, high-impact analytical data applications on the AWS ecosystem with Redshift, DynamoDB, Lambda, and data pipeline scheduling using Airflow Docker instances.
  • Architected big data workflows integrating large-volume Confidential time-series data to drive deep data insights for Analytics, Re-Ops, Marketing, and Mortgage systems.
  • Lead contributor and SME for all pay channels and systems supporting the entire US agent grid.
  • Collected KPIs and inferential metrics powered by machine learning and statistical models for Re-Ops drivers.
  • Provided operational support and rapid RCA tooling and resolution for existing systems, with an extensive weekly on-call rotation using PagerDuty for ADE's ETL pipelines, guaranteeing transaction consistency for Confidential's real estate transactions (see the sketch after this list).
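
As an illustration of the kind of consistency guard such a pipeline can use, here is a minimal sketch of an AWS Lambda handler in C# that records a processed-transaction marker in DynamoDB; the table, field, and class names are hypothetical:

```csharp
using System.Collections.Generic;
using System.Threading.Tasks;
using Amazon.DynamoDBv2;
using Amazon.DynamoDBv2.Model;
using Amazon.Lambda.Core;

[assembly: LambdaSerializer(typeof(Amazon.Lambda.Serialization.SystemTextJson.DefaultLambdaJsonSerializer))]

public class TransactionMarkerFunction
{
    private static readonly IAmazonDynamoDB Dynamo = new AmazonDynamoDBClient();

    // Records a processed-transaction marker; the conditional put fails if the
    // transaction was already seen, giving at-most-once processing downstream.
    public async Task<string> Handler(string transactionId, ILambdaContext context)
    {
        var request = new PutItemRequest
        {
            TableName = "ProcessedTransactions",   // hypothetical table name
            Item = new Dictionary<string, AttributeValue>
            {
                ["TransactionId"] = new AttributeValue { S = transactionId }
            },
            ConditionExpression = "attribute_not_exists(TransactionId)"
        };
        await Dynamo.PutItemAsync(request);
        return $"recorded {transactionId}";
    }
}
```

On a duplicate, the conditional put throws ConditionalCheckFailedException, which a caller can treat as "already processed."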

Confidential, Redmond, WA

Sr. Cloud Data Engineer/Architect

Responsibilities:

  • Built an Azure global analytical application using Confidential's cloud ecosystem (COSMOS®), the SQL Server Tabular model, and Confidential Power BI, consolidating structured, semi-structured, and raw data into schema-bound, aggregated, value-added information.
  • Designed and maintained fault-tolerant, incrementally adaptive enterprise ETL data pipelines utilizing Sangam® and Data Studio workflow orchestrators, with custom reducers written in C#/.NET, SQL, SCOPE/U-SQL, Azure Data Explorer (Kusto), and PowerShell.
  • Implemented Azure Functions for outputting privacy patterns and C+AI's GDPR compliance progress using real-time telemetry collection.
  • Designed, developed, and documented E2E enterprise KPIs for the proprietary S360 platform, with near-real-time compliance feed reduction using U-SQL and C#/.NET.
  • Implemented well-documented ingestion patterns for consuming structured and unstructured network health data from the Blackforest and Mooncake cloud platforms into the Power Metrics platform.
  • Automated and hosted a service onboarding application on Confidential Azure for 1300+ UST services, consuming the TFS API for MS Azure, the Service Tree API, and the Azure Data Explorer (Kusto) API, authenticated with Azure Key Vault (see the sketch after this list).
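
A minimal sketch of the Key Vault-backed authentication pattern such an onboarding application might use; the vault, secret, and endpoint names are hypothetical placeholders:

```csharp
using System;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Threading.Tasks;
using Azure.Identity;
using Azure.Security.KeyVault.Secrets;

public static class ServiceOnboardingClient
{
    public static async Task<string> FetchServicesAsync()
    {
        // DefaultAzureCredential resolves managed identity, environment
        // variables, or a developer's az login, in that general order.
        var vault = new SecretClient(
            new Uri("https://<vault-name>.vault.azure.net/"),
            new DefaultAzureCredential());

        KeyVaultSecret token = await vault.GetSecretAsync("servicetree-api-token");

        using var http = new HttpClient();
        http.DefaultRequestHeaders.Authorization =
            new AuthenticationHeaderValue("Bearer", token.Value);

        // Placeholder URL; the real Service Tree endpoint is internal.
        return await http.GetStringAsync("https://servicetree.example.com/api/services");
    }
}
```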
