
Sr Data Engineer Resume


SUMMARY

  • Sr Data Engineer with Data Engineering, Data Modelling, Data Quality and Data Governance Expertise.

TECHNICAL SKILLS

  • Informatica (PowerCenter, IDQ), SQL Server 2017, Teradata, Power BI, DB2, JDBC, ODBC, MS BI (SSIS, SSAS, SSRS)
  • Tableau, Tidal Scheduler, Tumbleweed FTP, Azure Data Factory, Hive, PySpark, Spark, Hadoop, XML
  • Azure Big Data, ER Studio, Undraleu, TFS, Git, SVN version control, Python, DPA, R Studio, C#
  • PowerShell, Linux/Unix, Azure Databricks, Azure Kusto (Data Explorer), Azure DevOps

PROFESSIONAL EXPERIENCE

Confidential

Sr Data Engineer

Responsibilities:

  • Developing an analytical platform in the cloud to serve the enterprise's analytical needs using Azure tools.
  • Using Azure Data Lake, Azure Data Factory, and Azure Databricks to move and conform data from on-premises systems to the cloud to serve the analytical needs of the company.
  • Using Databricks to load the curated layer from the raw layer with the help of Delta Lake.
  • Using Data Factory to load data from on-premises databases into the lake using the integration runtime.
  • Loading data from the curated layer into an Azure database for broader use with Data Factory.
  • Calling Databricks from within Data Factory to perform various ELT tasks.
  • Started loading facts and dimensions into the Azure SQL Data Warehouse (ADW) database using Data Factory Data Flows.
  • Developed a reusable, standardized extract process for the enterprise to reduce maintenance and time to market using the Azure tech stack.
  • Analyzed problems to design solutions involving large volumes of relational and non-relational data in databases and various files (flat files, XMLs, etc.), providing the insights business users need to make more efficient decisions and improve customer experience.
  • Designed a relational database and developed tables to load EHR data from different health care providers into BCBSNE databases.
  • Used Tidal Scheduler to automate ETL job execution.
  • Led a sub-group of developers to define an enterprise strategy for reusable objects and extracts in order to improve the efficiency of data movement.
  • Created scripts and FTP tools to manage file transfers to and from vendors.
  • Created complex queries to reconcile data between source and target with 100% accuracy.
  • Debugged and enhanced Informatica code to resolve data discrepancies for high-priority defects.
  • Involved in performance tuning at the source, target, mapping, session, and system levels, reducing complex workflow run times.
  • Collaborated with data architects, BI architects, and business analyst teams during data modelling and requirement-gathering sessions.
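The source-to-target reconciliation described above can be sketched as a set of count, sum, and anti-join checks. This is a minimal illustration using in-memory SQLite stand-ins; the table and column names (`source_claims`, `target_claims`, `claim_id`) are hypothetical, as the actual reconciliation ran against on-premises and Azure databases.

```python
import sqlite3

# In-memory stand-ins for the real source and target tables.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
    CREATE TABLE source_claims (claim_id INTEGER, amount REAL);
    CREATE TABLE target_claims (claim_id INTEGER, amount REAL);
    INSERT INTO source_claims VALUES (1, 100.0), (2, 250.5), (3, 75.0);
    INSERT INTO target_claims VALUES (1, 100.0), (2, 250.5);
""")

# Compare row counts and amount totals between source and target.
src_count, src_sum = cur.execute(
    "SELECT COUNT(*), SUM(amount) FROM source_claims").fetchone()
tgt_count, tgt_sum = cur.execute(
    "SELECT COUNT(*), SUM(amount) FROM target_claims").fetchone()

# Anti-join: rows present in the source but missing from the target.
missing = cur.execute("""
    SELECT s.claim_id FROM source_claims s
    LEFT JOIN target_claims t ON s.claim_id = t.claim_id
    WHERE t.claim_id IS NULL
""").fetchall()

print(src_count - tgt_count)
print([row[0] for row in missing])
```

A full reconciliation would typically extend these checks column by column (hash or value comparison per key) until source and target match exactly.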

Confidential, Tulsa, OK

Senior Database Developer/ ETL Developer

Responsibilities:

  • Reviewed the design documents and facilitated final design review meetings with the solution architects.
  • Development, testing and production moves of Informatica workflows as per the business needs.
  • Developed test plans and test scripts for unit and integration testing and validating them.
  • Performed data profiling activities to validate and confirm on expected transformation rules.
  • Analyzed data and validated data quality using the Informatica IDQ tool, and provided root-cause analysis to business users.
  • Identified root causes of job failures, coordinating failure triages and working with the operations team for speedy resolution and meeting SLAs.
  • Implemented reusable Informatica mapplets common across the subject area, which resulted in considerable savings in time and effort.
  • Implemented solution for handling dynamic input file to update multiple tables in a single Informatica execution.
  • Implemented incremental logic, mapping/table partition, bulk load option, for performance tuning of existing Informatica mapping.
  • Actively involved in dynamic parameterization, audit control process and developing mapping as per mapping specification.
  • Responsible for migrating code to the testing environment and fixing any code defects.
  • Used Workflow manager to create reusable sessions. Incorporated email tasks for failure alerts and successful notifications.
  • Worked closely with customers to gather requirements for the ICD10 Project and was responsible for creating the source requirement specification, Technical design document, and mapping document.
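The incremental-load logic mentioned above (loading only rows changed since the last run, rather than a full reload) can be sketched outside Informatica as a watermark comparison. This is a simplified sketch with hypothetical table and column names (`src`, `tgt`, `load_control`, `updated_at`), using SQLite in place of the actual databases.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
    CREATE TABLE src (id INTEGER, updated_at TEXT);
    CREATE TABLE tgt (id INTEGER, updated_at TEXT);
    CREATE TABLE load_control (last_watermark TEXT);
    INSERT INTO src VALUES (1, '2020-01-01'), (2, '2020-02-01'), (3, '2020-03-01');
    INSERT INTO tgt VALUES (1, '2020-01-01');
    INSERT INTO load_control VALUES ('2020-01-01');
""")

# Pull only rows newer than the stored watermark (the incremental delta).
(watermark,) = cur.execute("SELECT last_watermark FROM load_control").fetchone()
delta = cur.execute(
    "SELECT id, updated_at FROM src WHERE updated_at > ?", (watermark,)
).fetchall()
cur.executemany("INSERT INTO tgt VALUES (?, ?)", delta)

# Advance the watermark so the next run starts where this one ended.
cur.execute("UPDATE load_control SET last_watermark = "
            "(SELECT MAX(updated_at) FROM src)")
conn.commit()

tgt_ids = [row[0] for row in cur.execute("SELECT id FROM tgt ORDER BY id")]
print(tgt_ids)
```

In Informatica this pattern is usually realized with mapping parameters holding the watermark and a parameter file refreshed by an audit-control process, as the surrounding bullets describe.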

Confidential

Informatica Developer

Responsibilities:

  • As part of the GPD project, provided duplicate analysis used to track the movement of employees and contractors: when a new employee joins Confidential, the application captures and stores their information; when an employee leaves Confidential or moves to a new location, the application reflects that movement.
  • Matched Confidential's employee details from different systems, such as GEMS and VTA, against GPD data and flagged discrepancies so they could be corrected in the various source systems, improving data quality across systems.
  • Reports produced by this process are consumed by the Business Objects (BO) reporting tool for presenting results to senior management.
  • Development and requirement gathering.
  • Continuous improvement of data-matching quality.
  • End-to-end testing of the designed interfaces.
  • Day-to-day customer interaction to understand requirements and provide technical solutions.
  • Ensure a common understanding and set expectations through communication to align the stakeholders and team members.
  • Preparing and maintaining design and functional documents.
  • Ensuring ITIL processes - Change, Release and Incident are followed.
  • Implement application changes according to Change Management plan.
  • Understand the system, propose continuous-improvement ideas, and carry them through to implementation.
