
Data Integration Lead/Architect Resume


SUMMARY:

Technology leader with 12 years of experience building data and analytics platforms, data management, and governance; currently leading technology refresh engagements involving data marketplaces, data lakes, cloud databases, and Python and Scala on Spark. Strong communicator, able to align stakeholders and move projects from chaos to steady state.

EXPERTISE:

  • Data Marketplace, Enterprise Data Lake
  • Modern Data Platform Solutioning and Provisioning, API Management, Big Data Capacity Planning
  • Data Pipelines, Metadata Management, Data Quality

TECHNICAL SKILLS:

Modern Data Platforms: Qlik Data Catalyst (Podium Data), Paxata, HIVE, Kafka, Informatica, Snowflake, Ranger, Sentry

Data Management: Collibra, Informatica Data Quality

Data Pipelines: Informatica, Kafka

EXPERIENCE:

Confidential

Data Integration Lead/Architect

Responsibilities:

  • Data Integration Lead for strategic technology partnerships, architecting and building modern data platforms for new clients, including change management.
  • Solutioned and provisioned a modern data platform and data marketplace, built on Spark and Cloudera, across 20 business units for a P&C insurer ranked #66 on the Confidential 500.
  • Provided technical solutioning, process engineering, and governance frameworks for modern data platforms and enterprise data warehousing at mid-sized financial institutions.
  • Developed organizational competency on modern data platforms in cloud data warehousing, self-service analytics, data pipelines, and integrated security.

Confidential

Senior Consultant

Responsibilities:

  • Performed a data center migration project for a Confidential 500 financial services client; responsible for provisioning and system testing the major assets from a software/hardware migration standpoint.
  • Developed a Salesforce MDM solution, CAD (Centralized Advisor Database), and loaded data into Salesforce using Informatica PowerCenter to eliminate the manual effort of managing the firm's advisor data.

Confidential

Software Engineer

Responsibilities:

  • ETL Informatica Developer on a Data Migration project for a Confidential 500 Insurance company.

Confidential

Hadoop Consultant

Responsibilities:

  • Provision and implement Qlik Data Catalyst (QDC), enabling data ingestion from Kafka, mainframe files, and SQL Server into the Data Lake
  • Build data pipelines using QDC, WhereScape, and Hive to create Landing, Staging, Conformed, and Endpoint layers in the Data Lake
  • Manage Qlik APIs and integrate Ranger to secure Hive and HDFS objects
  • Write REST scripts to automate group-to-role mapping for all Data Lake users and to manage Hive/HDFS objects
  • Create QDC, Collibra, and DQ+ integration solutions for Data Governance using Python scripting
  • Currently setting up a Spark environment to build data pipelines that load data into the Data Lake using Scala and Python.
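The group-to-role REST automation above can be sketched in Python. This is a minimal, hypothetical example: the AD group names, role names, and the assignment payload shape are all illustrative assumptions, not QDC's actual API.

```python
# Hypothetical sketch: build group-to-role assignment payloads for a data
# catalog's REST API. Group names, role names, and the payload shape are
# illustrative assumptions, not QDC's actual API.
import json

# Assumed AD-group -> Data Lake role mapping (illustrative values)
GROUP_TO_ROLE = {
    "AD_DATALAKE_ANALYSTS": "lake_reader",
    "AD_DATALAKE_ENGINEERS": "lake_writer",
    "AD_DATALAKE_ADMINS": "lake_admin",
}

def build_role_assignments(users):
    """Map each user's AD groups to Data Lake roles.

    `users` is a list of dicts: {"name": ..., "groups": [...]}.
    Returns one assignment record per user, ready to POST as JSON.
    """
    assignments = []
    for user in users:
        roles = sorted({GROUP_TO_ROLE[g] for g in user["groups"]
                        if g in GROUP_TO_ROLE})
        assignments.append({"user": user["name"], "roles": roles})
    return assignments

if __name__ == "__main__":
    users = [
        {"name": "alice", "groups": ["AD_DATALAKE_ANALYSTS"]},
        {"name": "bob", "groups": ["AD_DATALAKE_ENGINEERS", "AD_DATALAKE_ADMINS"]},
    ]
    print(json.dumps(build_role_assignments(users), indent=2))
```

In a real run, the assembled records would be POSTed to the catalog's role-assignment endpoint; the payload construction is kept separate so it can be tested without network access.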

Confidential

Hadoop Consultant - Solution Architect, reporting to BI Manager

Responsibilities:

  • Create and present the Data Marketplace solution, built with the Qlik Data Catalyst tool, to the OPERS CIO, and ensure the consumption layer design is in sync with the leadership vision.
  • Responsible for end-to-end provisioning and ingestion of organizational data (mainframe files and JDBC sources via Sqoop) to create a Data Lake/Data Marketplace using the QDC tool.
  • Drive the project plan to complete the MVP (minimum viable product) assessment of the tool.
  • Work with the Hadoop vendor to performance-tune the Cloudera cluster to meet data ingestion and processing needs.
  • Design the security model for the Data Lake to create a well-governed solution; enable Sentry on the Cloudera cluster to perform authentication/authorization of Cloudera services using Active Directory.
  • Automate technical tasks using the QDC tool's REST API interface; compute stats for all Impala tables using shell scripting.
  • Drive ingestion best practices for big data, mainframe files, and JDBC sources using Sqoop.
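The compute-stats automation above amounts to generating one Impala `COMPUTE STATS` statement per table. A small sketch, assuming the database and table names are illustrative; in practice the table list would come from `SHOW TABLES` via impala-shell:

```python
# Hedged sketch of the compute-stats automation: emit one Impala
# COMPUTE STATS statement per table. Database/table names are illustrative;
# a real script would read the table list from `SHOW TABLES` via impala-shell.
def compute_stats_statements(database, tables):
    """Return one COMPUTE STATS statement per table in the database."""
    return [f"COMPUTE STATS {database}.{table};" for table in tables]

if __name__ == "__main__":
    for stmt in compute_stats_statements("conformed", ["policies", "claims"]):
        print(stmt)
```

The generated statements would then be fed to `impala-shell -q` (or a script file), keeping the statement generation testable on its own.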

Confidential

Technology Integration Lead

Responsibilities:

  • Partnered closely with Confidential's Enterprise Data Office, cross-commit teams, vendors, and business users, and performed end-to-end planning for rolling out new applications across the enterprise.
  • Provisioned Paxata (Spark and Hadoop based application), upgraded Paxata from 2.19 to 2.22 version, and upgraded IDQ from v9.1 to v10.1.
  • Created a security model for each asset and defined various processes, including roles and privileges, onboarding of new business units, and build-to-run transition.
  • Worked in an Agile environment; owned end-to-end delivery for provisioning new applications, performed upgrades, provided detailed estimates and task breakdowns for story cards and features, created detailed implementation plans, and managed stakeholder communications.
  • Maintained and onboarded new business units for Informatica Metadata Manager application and upgraded Informatica Data Quality from v9.6.1 to v10.2.
  • Worked with various cross-commit teams, especially the vendor Fiserv, to perform the end-to-end setup required for the batch interfaces.
  • Analyzed the SCV (Single Customer View) model and created high level designs to load the data.
  • Examined the current bank data mart architecture, performed source system data analysis for FDR, Signature, and AODB, and prepared the source system inventory document.
  • Collaborated with Fiserv source teams to understand the BI extracts, route the data to various downstream systems, and load it into the stage, vault, and bank data mart layers.
  • Understood the ACH Fed file layouts and designed the process to load the files into the NOSS system.
  • Created Fraud and AML reports from ETL jobs as an alternate solution until FCRM was set up and running.

Confidential

Project Lead, Production Support

Responsibilities:

  • Understood the current data warehousing architecture, performed source system data analysis for FDR, ACBS, and Miser, and prepared the source system inventory document.
  • Managed all scheduling needs for Informatica PowerCenter and MDM jobs to create complex schedules to meet project SLAs.
  • Worked directly with project teams to understand end-to-end implementation of the MDM solution and performed failover testing for MDM Cluster.
  • Collaborated with project teams to install, configure, and test the Resource Kit for MDM batch jobs scheduling.
  • Created detailed documentation from a governance standpoint to ensure project teams adhere to MDM jobs naming standards, batch jobs sequencing, and error handling.
  • Performed end-to-end analysis of the current environment and recommended improvements to accommodate future needs.
  • Worked with key business teams to retire, move, or restore various applications during merger/acquisition activities.

Confidential

Senior Consultant

Responsibilities:

  • Analyzed the DST REP and firm data use cases to drive the data model and requirements.
  • Examined data across three systems (DST, Salesforce, and CSTAR) to define criteria for a unique advisor ID across the systems.
  • Designed and developed ETL and Linux components adhering to client-specific governance standards and architecture.
  • Executed SIT for ETL and .NET capabilities and coordinated causal analysis and defect remediation.
  • Ensured the correctness of data populated in the CAD (Centralized Advisor Database), a small MDM solution created as part of this effort.
