
Data Architect Resume


Phoenix, AZ

SUMMARY

  • Certified Informatica, Teradata, and SAP HANA IT professional with 13+ years of well-honed experience in analyzing business requirements and in software development.
  • Lead Data Architect for the Data Integration module of Informatica end-to-end implementation and production support projects.
  • Proficient in cloud-based ETL pipelines using Azure Data Factory, Apache Airflow, Azure Databricks, Python, and Spark.
  • Well versed in designing, architecting, and implementing end-to-end cloud data warehouse solutions with Snowflake.
  • Well versed in Snowflake concepts such as Time Travel, Data Sharing, Snowpipe, and SnowSQL.
  • Experience using the Spark Structured Streaming API for processing streaming data.
  • Experience working with GitHub as a CI/CD platform.
  • Experience in designing, coding, and supporting data-intensive systems at scale.
  • Expertise in designing and architecting ETL solutions with Informatica PowerCenter and Developer client.
  • Implemented data virtualization solutions using Denodo.
  • 6+ years of UNIX shell scripting.
  • Expertise in ETL solutions spanning databases including Teradata, Oracle, SQL Server, Salesforce, and SAP R/3.
  • Experience in dimensional modeling for data warehouses using Erwin.
  • Well versed with visualization tools such as IBM Cognos and Power BI.
  • 6+ years of ETL experience implementing SAP ERP projects.
  • Well versed in consuming data in file formats such as Parquet and JSON.
  • Strong understanding of stored procedures, database functions, indexes, and query plans for troubleshooting performance-related issues.
  • Experience in project execution using Software Development Life Cycle processes.
  • Excellent team player with effective time management skills and the ability to prioritize, delegate, and meet deadlines.

TECHNICAL SKILLS

ETL Tools: Informatica, Azure Databricks, Denodo, SAP BODS

Database: Oracle, Teradata, SQL Server, SAP HANA, Snowflake

Scripting: UNIX Shell, Python

Visualization Tools: Cognos 11, Power BI

Scheduling Tools: Control-M

Quality Assurance: HPQC, HP Service Manager

PROFESSIONAL EXPERIENCE

Data Architect

Confidential, Phoenix, AZ

Responsibilities:

  • Analysis and solution design for migrating an on-prem Oracle database to the Snowflake cloud data warehouse.
  • Extract Oracle tables into Parquet files using Informatica Developer and upload them to ADLS with AzCopy.
  • Create an external Snowflake stage pointing to ADLS.
  • Design the data model and implement database structures in Snowflake.
  • Define transformation rules and build stored procedures to transform and load daily interval and aggregate data into Snowflake, along with the audit tables needed for reconciliation.
  • Schedule the file loads via Snowflake tasks.
  • Design Power BI report dashboards showing the quality of reads imported into MDMS from head-end systems and of data loaded into the data mart from MDMS.
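The stage-and-task flow above can be sketched as Snowflake SQL generated from Python. This is a minimal sketch: every object name (`reads_stage`, `daily_reads`, `load_wh`), the ADLS URL, and the schedule are illustrative placeholders, not the actual project objects.

```python
def snowflake_load_statements(stage: str, adls_url: str, table: str,
                              file_pattern: str = ".*\\.parquet") -> list:
    """Build the Snowflake statements for an external-stage Parquet load.

    All object names and the URL are placeholders for illustration.
    """
    return [
        # External stage pointing at the ADLS container (credentials elided).
        f"CREATE STAGE IF NOT EXISTS {stage} URL = '{adls_url}' "
        f"FILE_FORMAT = (TYPE = PARQUET);",
        # Bulk load of the Parquet files staged by AzCopy.
        f"COPY INTO {table} FROM @{stage} PATTERN = '{file_pattern}' "
        f"MATCH_BY_COLUMN_NAME = CASE_INSENSITIVE;",
        # Task that re-runs the load on a daily schedule.
        f"CREATE TASK IF NOT EXISTS {table}_load_task "
        f"WAREHOUSE = load_wh SCHEDULE = 'USING CRON 0 2 * * * UTC' "
        f"AS COPY INTO {table} FROM @{stage} PATTERN = '{file_pattern}';",
    ]
```

A task-based schedule keeps the load entirely inside Snowflake; an external scheduler would work equally well for the same statements.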

Data Engineer

Confidential

Responsibilities:

  • Analysis for conversion of legacy Informatica processes to Azure Databricks ETL pipelines.
  • Document existing transformation rules.
  • Ingest daily files from ADLS into Azure Delta Lake using Python and Spark.
  • Load data into the Raw, Certified, and Analytical layers of the Delta Lake.
  • Design the data model and implement database structures in Snowflake.
  • Merge data into the Snowflake data warehouse for reporting needs.
  • Share data with downstream applications and third-party vendors by unloading it to Azure Blob Storage.
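The Raw → Certified → Analytical promotion can be sketched in pure Python; the real pipeline ran on Spark over Delta Lake, and the meter-read schema (`meter_id`, `read_ts`, `kwh`) is hypothetical.

```python
def certify(raw_rows):
    """Raw -> Certified: drop malformed rows and normalize types.

    The schema is illustrative only; in the real pipeline this was a
    Spark transformation writing to the Certified Delta layer.
    """
    certified = []
    for row in raw_rows:
        if not row.get("meter_id") or row.get("kwh") is None:
            continue  # malformed records would be quarantined, not dropped
        certified.append({
            "meter_id": str(row["meter_id"]).strip(),
            "read_ts": row["read_ts"],
            "kwh": float(row["kwh"]),
        })
    return certified


def aggregate(certified_rows):
    """Certified -> Analytical: daily kWh totals per meter."""
    totals = {}
    for row in certified_rows:
        key = (row["meter_id"], row["read_ts"][:10])  # date part of ISO timestamp
        totals[key] = totals.get(key, 0.0) + row["kwh"]
    return totals
```

Separating validation (Certified) from aggregation (Analytical) is what lets downstream consumers choose the granularity they need.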

Environment: Azure Databricks, Python, Snowflake, Power BI

Data Warehouse Architect

Confidential, Phoenix, AZ

Responsibilities:

  • Interact with users and understand Business requirements. Translate Business Requirements into System Requirements and create Technical Design Documents.
  • Design and Develop Optimized Solutions to load large volume Meter Read data, Solar Plant and Supply Chain data into Oracle DW.
  • Design and develop Data conversion and History load strategies.
  • Implement slowly changing Dimensions and code to handle late arriving dimensions.
  • Use performance improvement techniques such as persistent cache and partitioning.
  • Provide production support for the Informatica- and Cognos-based Energy Delivery Warehouse.
  • Review data models and suggest enhancements; provide support for issues during SIT and UAT.
  • Work on the Metric Data Tracking development project (KPI dashboard), which provides insight into plant performance and safety compliance.
  • Handle Informatica administration tasks such as user access, code migration, repository backups, and new folders and connections.
  • Work with business users to analyze and resolve issues.
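The slowly changing dimension handling above, including the late-arriving-dimension insert, can be sketched in plain Python. In the project this logic lived in Informatica mappings; the field names here are illustrative only.

```python
from datetime import date


def scd2_apply(dimension, incoming, today=None):
    """Apply an SCD Type 2 change set to an in-memory dimension.

    dimension: list of dicts with keys nk (natural key), attrs,
    eff_from, and eff_to (None = current version). A sketch only;
    the production version ran as Informatica mapping logic.
    """
    today = today or date.today().isoformat()
    current = {r["nk"]: r for r in dimension if r["eff_to"] is None}
    for nk, attrs in incoming.items():
        row = current.get(nk)
        if row is None:
            # New member; this branch also covers late-arriving dimensions,
            # inserting a row so facts can still resolve the key.
            dimension.append({"nk": nk, "attrs": attrs,
                              "eff_from": today, "eff_to": None})
        elif row["attrs"] != attrs:
            row["eff_to"] = today  # expire the old version
            dimension.append({"nk": nk, "attrs": attrs,
                              "eff_from": today, "eff_to": None})
    return dimension
```

Expiring rather than overwriting rows is what preserves history: a fact joined on `eff_from`/`eff_to` sees the dimension as it was on the transaction date.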

Environment: Informatica 10.2.0 HF1, Denodo 7.0, Oracle 12c, SQL Server 2012, Cognos 11

ETL Architect

Confidential, Tempe, AZ

Responsibilities:

  • Interact with Business users and understand Business requirements. Translate Business Requirements into System Requirements and create Technical Design Documents.
  • Define and develop SAP BODS jobs to migrate data from legacy to SAP systems.
  • Develop Interfaces to sync data with SAP Business Warehouse using SAP BODS and SLT for data provision.
  • Design and develop Full and Incremental data loads with BODS.
  • Data Migration from Legacy to SAP system.
  • Use Information Steward for Data Profiling, parse and standardize data.
  • Implement incremental data loads and change data capture using mapping variables and a checksum function.
  • Design, develop, and support dataflows, user management, and analytic privileges; create Attribute, Analytic, and Calculation Views.
  • Troubleshoot report issues and coordinate with Business and IT teams to resolve them.
  • Develop shell scripts for source file validation and FTP processes.
  • Make extensive use of parameter files and command-line utilities; design ETL workflows with job dependencies and scheduling.
  • Develop Oracle database objects like procedures, functions and Triggers.
  • Assisted in developing test plans, support for System Test and User acceptance testing.
  • Creating deployment groups and Migrating Informatica components from one environment to another.
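The checksum-based change data capture mentioned above can be sketched as follows. The real implementation used Informatica mapping variables and a checksum expression; the column names here are placeholders.

```python
import hashlib


def row_checksum(row, key_cols):
    """MD5 over the sorted non-key columns, used to detect changed rows."""
    payload = "|".join(str(v) for k, v in sorted(row.items())
                       if k not in key_cols)
    return hashlib.md5(payload.encode()).hexdigest()


def detect_changes(source_rows, target_checksums, key_cols=("id",)):
    """Split incoming rows into inserts and updates by comparing checksums.

    target_checksums maps key tuple -> stored checksum. Column names
    are illustrative.
    """
    inserts, updates = [], []
    for row in source_rows:
        key = tuple(row[k] for k in key_cols)
        chk = row_checksum(row, key_cols)
        if key not in target_checksums:
            inserts.append(row)
        elif target_checksums[key] != chk:
            updates.append(row)
    return inserts, updates
```

Comparing one stored checksum per row avoids column-by-column comparisons against the target, which is what makes this pattern cheap for wide tables.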

Environment: Informatica PowerCenter 9.5.1, PowerExchange for SAP NetWeaver, SAP BODS, Oracle 11g, SAP ECC, UNIX Shell Scripts

ETL Developer

Confidential, Plano, TX

Responsibilities:

  • Interact with Business users and understand Business requirements. Translate Business Requirements into System Requirements and create Technical Design Documents.
  • Analyze business requirements. Handle complex Service and maintenance requests. Make necessary design changes to implement Business requirements.
  • Team Lead for a team of 2 people at the client location and 5 offshore.
  • Design and develop Informatica and BO based applications.
  • Developed and documented data Mappings using transformations to load data from OLTP systems into EDW.
  • Perform Design and Code Review on Requests worked upon by team members.
  • Analyze and simplify development and maintenance by creating reusable transformation objects, Mapplets, Sessions, Worklets, and Shortcuts.
  • Build Perl scripts to FTP mainframe datasets into flat files on the Informatica UNIX server.
  • Assisted in developing test plans, support for System Test and User acceptance testing.
  • Migrating Informatica, DB2 and UNIX components from one environment to another.
  • Monitor Production Environment and Identify Improvement areas.
  • Perform basic Admin activities like Manage alerts, Manage user security, Create folders, and Manage application services.
  • Scheduled jobs using the Control-M scheduler.
  • Prepare Effort estimates for requirements and Change Request.
  • Coordinate work with the offshore team.
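A Python equivalent of the Perl FTP pull might look like the sketch below. The original ran as a Perl script; the host, credentials, and dataset-to-filename convention here are assumptions for illustration.

```python
from ftplib import FTP


def local_name(dataset: str) -> str:
    """Map a mainframe dataset name (e.g. 'PROD.CLAIMS.DAILY') to a
    flat-file name on the Informatica server. Convention is illustrative."""
    return dataset.lower().replace(".", "_") + ".dat"


def pull_dataset(host: str, user: str, password: str, dataset: str) -> str:
    """Fetch one mainframe dataset over FTP into a local flat file.

    Host and credentials are placeholders.
    """
    target = local_name(dataset)
    with FTP(host) as ftp:
        ftp.login(user, password)
        with open(target, "wb") as fh:
            # Quoting selects the dataset by its fully qualified name
            # rather than relative to the login prefix.
            ftp.retrbinary(f"RETR '{dataset}'", fh.write)
    return target
```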

Environment: Informatica PowerCenter 8.6/9.1, Oracle 11g, TOAD, DB2, Control-M, UNIX Shell Scripts, BO 4.0

SAP Integration Developer

Confidential, Fort Worth, TX

Responsibilities:

  • Interact with Business users and understand Business requirements. Translate Business Requirements into System Requirements and create Technical Design Documents.
  • Lead a team of 2 people at the client location and 7 people offshore
  • Design and develop both Inbound and Outbound Legacy to SAP Interfaces.
  • Design and develop Full and Incremental data loads with BODS.
  • Implement SCD Type 1 and Type 2 logic using BODS built-in transformations.
  • Data Migration from Legacy to SAP system using BODS.
  • Use Information Steward for Data Profiling, parse and standardize data.
  • Production support for 200+ Legacy-SAP Interfaces.
  • Develop Inbound and Outbound IDOC mappings.
  • Implemented Mappings using Web Services and Java Transformation.
  • Participate in RCB and CAB meeting for defect and enhancement approvals.
  • Monitor Production Environment and Identify Improvement areas.
  • Prepare Effort estimates for requirements and Change Request.

Environment: Informatica PowerCenter 8.6/9.1, PowerExchange for SAP NetWeaver, PowerExchange for Mainframe, WebSphere MQ, DB2 on AIX, SQL Server 2008, AIX UNIX, Control-M

Technology Analyst

Confidential, Richmond, VA

Responsibilities:

  • Lead an offshore team of 3 developers and 4 production support engineers.
  • Interact with Onsite Leads and Business users and understand Business requirements. Translate Business Requirements into System Requirements and guide team to make design changes to applications.
  • Enhance and Support Informatica and Teradata based data warehouse applications.
  • Developed Informatica mappings to read data from Oracle transactional Databases into Teradata Tables.
  • Develop Teradata BTEQ scripts to load data into pre-stage tables using load utilities such as FastExport, MultiLoad, FastLoad, and TPump.
  • Expertise with different Teradata indexes such as UPI, USI, PPI, join indexes, and columnar tables.
  • Knowledge of Teradata temporal tables and temporary tables, including volatile and global temporary tables.
  • Develop Teradata Bteq scripts for incremental data load from Pre Stage Tables to Warehouse.
  • Perform Design and Code Review on Requests worked upon by offshore team members.
  • Handle complex service and maintenance requests.
  • Provide support for Claims and Membership DW applications.
  • Build a data mart for individual member information.
  • Monitor and Tune Load process
  • Document and Carry out Unit Test and Provide support for System and UAT.
  • Defect Prevention anchor for Claims and Membership DW applications.
  • Analyze production failures and other service request tickets opened by Business users.
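The incremental pre-stage-to-warehouse load can be sketched as a generated BTEQ script. Table names, the key column, and the logon are illustrative; the real scripts also handled updates, error tables, and restartability.

```python
def bteq_incremental(stage_table: str, target_table: str,
                     key_col: str, load_date_col: str = "load_dt") -> str:
    """Render a minimal BTEQ script that inserts only new keys from the
    pre-stage table into the warehouse table. A sketch with placeholder
    names; not the production script."""
    return f"""\
.LOGON tdprod/etl_user,${{TD_PASSWORD}};
INSERT INTO {target_table}
SELECT s.*
FROM {stage_table} s
LEFT JOIN {target_table} t ON s.{key_col} = t.{key_col}
WHERE t.{key_col} IS NULL
  AND s.{load_date_col} = CURRENT_DATE;
.IF ERRORCODE <> 0 THEN .QUIT 8;
.LOGOFF;
.QUIT 0;
"""
```

The non-zero `.QUIT` code is what lets the scheduler (WLM here, Control-M elsewhere) detect a failed load and hold downstream jobs.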

Environment: Informatica PowerCenter 8.6, PowerExchange for Mainframe, Oracle 10g, Teradata 12, UNIX, WLM.

Software Engineer

Confidential

Responsibilities:

  • Interact with the onsite team and collect requirements on new enhancements and production issues.
  • Understand Business requirements and translate them into system requirements and make design changes to implement those requirements.
  • Design and develop Oracle PL/SQL stored Procedure and Packages.
  • Offshore production support coordinator for the critical Data Upload application.
  • Track production support activity on a day-to-day basis and send reports to onsite leads.
  • Perform Unit Testing and Document Unit Test results.
  • Involved in Code migration to different environments (Dev/Test /Prod).
  • Developed audit & error reporting system to track errors and report to business users.
  • Defect Prevention anchor for all the modules.
  • Configuration Management using VSS.

Environment: Oracle 9i, UNIX Shell Scripting, Clarify, BMC Remedy, IPM+.
