
Data Analyst/Data Architect Resume

SUMMARY:

  • Lead Data Analyst and Data Engineer with 10+ years of experience, capable of evaluating, designing & engineering data products ready for data mining and designing modern data warehouses.
  • Helping Telecom, Media, and Technology firms with their real-time targeted advertising and data analytics platforms, while at the same time specializing in Healthcare Data Analytics with Claims, Provider & Membership data.
  • Responsible for defining the data pipeline input strategy for Data Analytics (AI) products and Data Streaming services across the digital org, supporting clients' digital transformation and customer personalization objectives.
  • Focused on design consultation for innovative, portable, interfaceable data solutions for digital organizations using cutting-edge technologies (NoSQL DBs, data/message routers, streaming apps etc., such as Kafka).

TECHNICAL EXPERIENCE:

Big Data: (Hive, Flume, Sqoop, HBase, Ambari, MapReduce with PySpark/Python) - 2.5 Years (Basic)

Database: Teradata - 10 Years, Vertica - 2.5 Years | NoSQL DB: Cassandra - 1 Year (Basic)

ETL: Informatica - 7 Years

Language: Python - 1 Year, UNIX Scripting - 4 Years

Concept: Data Modeling (NoSQL and Relational) - 3 Years, Data Warehousing - 10 Years, Data Architecture - 5 Years

SDLC: Agile Scrum (Rally, TDP), Agile SAFe (AgileCraft), Waterfall.

Cloud: AWS (Learning)

EXPERIENCE SUMMARY:

Data Analyst/Data Architect

Confidential

Responsibilities:

  • Act as Data Strategist for analytics application data needs; discover and understand data in the Enterprise Data Repository, Data Lake, Data Marts and various reporting databases.
  • Coordinate and organize meetings with stakeholders and concerned teams such as business clients, Sales, Legal, Marketing, Data Scientists, other analytics platforms, Content Management, Data Streaming and various support teams, to identify and fulfill use case analysis and data requirements.
  • Responsible for data mining of log data (JSON, XML etc.) utilizing available tools and languages such as Python, PySpark, Hive and Cassandra, and making it available for analytics needs.
  • Responsible for designing database models (conceptual, logical & physical, for NoSQL and relational stores) to satisfy application business and technical requirements.
  • Create and design reports.
  • Design and define complex business logic/data transformation mappings (STTM) for heterogeneous sources.
  • Support the Solution Architect by identifying architectural changes, evaluating data migration tools, consulting external vendors for external data sources, etc.
  • Provide production support for data integrity to different support teams.
  • Responsible for understanding data mapping business requirements and converting them into ETL design, coding and scheduling.
  • Data analysis of structured/unstructured data using SQL and Spark.
  • Ensure and recommend implementation guidelines for data security compliance such as SPI, CPNI & RPI.
  • Define the business logic for Statistical Model data needs, including statistical analysis of data with basic techniques in SPSS (mean, SD, variance, skewness etc.), profiling (source, DB, table & column), identifying natural keys for new data sources, defining granularity, and developing business logic into SQL queries.
  • Create data samples for analytical models from heterogeneous data sources such as enterprise repositories (ECDW, EDM, Data Lake & Data Marts).
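The log-mining work above can be sketched in plain Python. This is a minimal, hypothetical example assuming newline-delimited JSON logs; the field names (`user_id`, `event`) are placeholders, not the actual schema from the engagement:

```python
import json
from collections import Counter

def mine_log_lines(lines):
    """Parse newline-delimited JSON log records and count events per user.

    Field names ("user_id", "event") are hypothetical placeholders for
    whatever the real log schema exposes; malformed lines are skipped.
    """
    events_per_user = Counter()
    for line in lines:
        try:
            record = json.loads(line)
        except json.JSONDecodeError:
            continue  # skip malformed records rather than failing the batch
        user = record.get("user_id")
        event = record.get("event")
        if user and event:
            events_per_user[(user, event)] += 1
    return events_per_user

sample = [
    '{"user_id": "u1", "event": "click"}',
    '{"user_id": "u1", "event": "click"}',
    'not json at all',
    '{"user_id": "u2", "event": "view"}',
]
counts = mine_log_lines(sample)
```

At scale the same pattern would run as a PySpark job over the Data Lake rather than a single-process loop; the skip-on-parse-failure choice keeps one bad record from aborting an entire batch.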

Data Architect/Data Product Engineer

Confidential

Responsibilities:

  • Solutioning (designing, developing & deploying) digital data product pipelines for analytics applications.
  • Design and development of data migration from heterogeneous data sources (Hadoop Data Lake, Teradata & Vertica) to perform analytics.
  • Provide solutions for designing and defining metadata for enterprise data streaming platforms such as Kafka and Confidential InfoSphere Streams, enabling real-time event message communication and seamless integration across enterprise channels.
  • Architect data integration with real-time and at-rest data sources such as Adobe Firehose streams, Data Lake (on HDFS), GoldenGate, data/message routers, and the Teradata & Vertica platforms.
  • Identify, resolve & design architectural changes and needs for the application analytics platform.
  • Responsible for developing tools for data exploration and auto SQL building.
  • Responsible for performance tuning of Teradata/Vertica scripts and Informatica workflows & mappings.
  • Design and develop complex ETL flows (3K-20K LOC) for complex mappings such as Claims Billing Reduction/Rolldown, Discounts, Membership Enrollment etc.
  • Responsible for conducting meetings with stakeholders and support teams for end-to-end design, development and deployment, including App DBAs, Release Management, Production Support, Data Integration, Data Modeling, project PMs, Delivery Managers, CA Schedulers etc.
  • Create and review project artifacts such as HLD, TSD, job scheduling design, test data requirements (DCR), ETL spec, production deployment plan & schedule, and off-cycle delivery deployment plan.
  • Responsible for production technical/business issue fixes and ETL re-design work.
  • Plan, audit & performance-tune Teradata and Vertica SQL queries in SQL Assistant and SPSS.
  • Responsible for creating and reviewing LOEs; help the PM with resource requirements.
  • Performed onshore/offshore coordination; coached resources on modeling, technical & business requirements, as well as new concepts & technology of Teradata & Informatica.
  • Create unit test cases.
  • Profile feeds from application data sources.
  • Design Informatica ETL flows in Mapping Designer, Workflow Designer and Workflow Monitor.
  • Coach new resources on the ETL process.
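The "auto SQL building" and column-profiling work above can be illustrated with a small Python generator. This is a sketch, not the actual tool: the table and column names are hypothetical, and the emitted SQL sticks to ANSI constructs that both Teradata and Vertica accept:

```python
def build_profile_sql(table, columns):
    """Auto-build a one-pass column-profiling query.

    For each column, emits a distinct count and a null count alongside the
    total row count. Table/column names here are illustrative placeholders;
    a production tool would also quote identifiers and validate them.
    """
    select_parts = ["COUNT(*) AS row_count"]
    for col in columns:
        select_parts.append(f"COUNT(DISTINCT {col}) AS {col}_distinct")
        select_parts.append(
            f"SUM(CASE WHEN {col} IS NULL THEN 1 ELSE 0 END) AS {col}_nulls"
        )
    return "SELECT\n  " + ",\n  ".join(select_parts) + f"\nFROM {table}"

sql = build_profile_sql("claims", ["member_id", "provider_id"])
```

Computing all per-column counts in a single scan, rather than one query per column, is the usual performance-tuning choice on large warehouse tables.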
