
Application Developer, Technical/Data Analyst, Techno Functional Lead Resume

TECHNICAL SKILLS:

Languages: PL/SQL (Advanced), Python, Unix Shell Scripting

Reporting/BI Tools Worked On: Tableau, Splunk

Databases Worked On: Oracle, Teradata, Vertica, MySQL

ETL Tools Worked On: Informatica

Theory and Techniques: Data Warehousing, Business Intelligence, Data Mart

Version Control Tools: SVN, Git

Scheduling/Workflow Tools: Autosys (JIL scripting), Oozie

Data Streaming/Migration: Kafka

Operating Systems: macOS, Windows

PROFESSIONAL EXPERIENCE:

Confidential

Application Developer, Technical/Data Analyst, Techno Functional Lead

Technologies/Databases used: Advanced SQL, Kafka, Teradata, Tableau Reports, Confidential Internal GSF framework

Responsibilities:

  • Confidential agents use this tool to handle customer needs for Confidential online sales and services
  • The tool is used to analyze agent performance metrics such as resolution time, after-call work time, and agent handle time
  • Every call and chat that comes in through the tool is recorded, along with the number of orders placed online with agent assistance. Chat transcripts are analyzed to gauge how well agents help end customers, and customer feedback is collected to measure satisfaction with the help provided
  • Analyzed, designed, developed, and implemented changes for new back-end processing requirements
  • Developed several detailed and summary reports, including graphical representations and trend-analysis reports, according to business requirements (a simplified aggregation-query sketch follows this list)
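
A minimal sketch of the kind of agent-performance aggregation behind such reports, assuming a hypothetical agent_interactions table and a generic DB-API connection; table and column names are illustrative, not the actual Confidential schema:

```python
import pandas as pd

# Hypothetical daily agent-performance summary; table/column names are illustrative.
AGENT_SUMMARY_SQL = """
SELECT
    agent_id,
    CAST(interaction_start AS DATE)                     AS activity_date,
    AVG(resolution_time_sec)                            AS avg_resolution_time_sec,
    AVG(after_call_work_sec)                            AS avg_after_call_work_sec,
    AVG(handle_time_sec)                                AS avg_handle_time_sec,
    SUM(CASE WHEN order_placed = 1 THEN 1 ELSE 0 END)   AS assisted_orders
FROM agent_interactions
WHERE interaction_start >= CURRENT_DATE - 30
GROUP BY agent_id, CAST(interaction_start AS DATE)
"""

def load_agent_summary(conn):
    """Return a 30-day agent performance summary as a DataFrame (conn is any DB-API connection)."""
    return pd.read_sql(AGENT_SUMMARY_SQL, conn)
```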

Confidential

Application Developer, Technical/Data Analyst, Techno Functional Lead

Technologies used: Advanced SQL, Oracle, Vertica, Tableau Reports, Confidential Internal GSF framework

Responsibilities:

  • Records online sales, services provided by agents to end customers, agent performance, call and chat information between agents and end customers, and invoice and delivery information
  • Real-time sales data is reported in the dashboard by joining data from different sources such as Oracle DB, Informatica DB, Kafka, and Storm DB
  • Migrated the data source from the Hadoop database to Kafka queues for faster processing and real-time data availability at the semantic layer, where aggregated data is built for better performance and tighter SLAs. Confidential internal tools (GDT/Splunk) were used to deploy and debug on production servers
  • Automated view creation with a complex PL/SQL script that uses cursors to validate business rules and generate the views, restricting access to PII columns in the process (a simplified sketch of the idea follows)
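
A minimal sketch of the view-generation idea. The original was a PL/SQL cursor loop; it is rendered here in Python over a generic Oracle DB-API connection (e.g. python-oracledb), with a hypothetical pii_columns rule table and illustrative schema names:

```python
# Hypothetical sketch: build one view per source table, excluding columns flagged as PII.
# Assumes `conn` is an open Oracle DB-API connection and a pii_columns rule table exists.
PII_RULES_SQL = """
SELECT table_name, column_name
FROM pii_columns
WHERE restricted = 'Y'
"""

def create_restricted_views(conn, owner="SALES"):
    cur = conn.cursor()

    # Collect the business rules: which columns of which tables are PII-restricted.
    cur.execute(PII_RULES_SQL)
    pii = {}
    for table_name, column_name in cur:
        pii.setdefault(table_name.upper(), set()).add(column_name.upper())

    # Walk the data dictionary, like the original cursor loop over the schema's tables.
    cur.execute(
        "SELECT table_name, column_name FROM all_tab_columns WHERE owner = :o ORDER BY table_name, column_id",
        o=owner,
    )
    columns_by_table = {}
    for table_name, column_name in cur:
        columns_by_table.setdefault(table_name, []).append(column_name)

    # Emit one CREATE OR REPLACE VIEW per table, keeping only the non-PII columns.
    for table_name, columns in columns_by_table.items():
        allowed = [c for c in columns if c.upper() not in pii.get(table_name.upper(), set())]
        if not allowed:
            continue  # every column is PII-restricted; skip the view
        ddl = (
            f"CREATE OR REPLACE VIEW {owner}.V_{table_name} AS "
            f"SELECT {', '.join(allowed)} FROM {owner}.{table_name}"
        )
        cur.execute(ddl)
    conn.commit()
```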

Confidential

Application Developer

Technologies used: Advanced SQL, Vertica, Teradata, User Data Management tool, Tableau

Responsibilities:

  • During Confidential NPI (New Product Introduction), the business demands a quick turnaround: orders placed on AOS, in retail stores, etc. must be available on the dashboard within a short SLA from the time they land in SAP
  • Developed various aggregated solutions to pull data from SAP and other sources into the EDW using Teradata and Vertica (an incremental-load sketch follows this list)
  • Created complex analytical Tableau reports on top of historical and current data in the Enterprise Data Warehouse
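
A minimal sketch of an SLA-driven incremental load of this kind, assuming a hypothetical etl_control watermark table and illustrative staging/aggregate table names; the qmark (?) parameter style is also illustrative and may differ by driver:

```python
# Hypothetical incremental load: pull only orders newer than the last watermark,
# aggregate them, and advance the watermark so the dashboard stays within its SLA.
def load_order_aggregates(conn, job_name="npi_order_aggregates"):
    cur = conn.cursor()

    # 1. Read the last high-water mark for this job (etl_control is an illustrative table).
    cur.execute("SELECT last_loaded_ts FROM etl_control WHERE job_name = ?", (job_name,))
    (last_loaded_ts,) = cur.fetchone()

    # 2. Aggregate only the new orders from the SAP staging area into the reporting table.
    cur.execute(
        """
        INSERT INTO edw_semantic.order_sales_agg (order_date, channel, product_id, units, revenue)
        SELECT CAST(order_ts AS DATE), channel, product_id, SUM(quantity), SUM(net_amount)
        FROM edw_core.sap_orders_stg
        WHERE order_ts > ?
        GROUP BY CAST(order_ts AS DATE), channel, product_id
        """,
        (last_loaded_ts,),
    )

    # 3. Advance the watermark to the newest order just processed.
    cur.execute(
        "UPDATE etl_control SET last_loaded_ts = (SELECT MAX(order_ts) FROM edw_core.sap_orders_stg) WHERE job_name = ?",
        (job_name,),
    )
    conn.commit()
```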

Confidential

Application Developer

Technologies used: Teradata, User Data Management tool, Tableau reports

Responsibilities:

  • Confidential is an inventory tool used to manage inventory and backlog, ensuring the correct level of product is held in the right locations to maximize sell-through
  • Developed aggregated solutions to handle reseller and inventory data and displayed the results in a Tableau dashboard (a simplified aggregation sketch follows)
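
A minimal sketch of the kind of reseller-level inventory aggregation feeding such a dashboard, using pandas; the column names (on-hand units, backlog, weekly sales) are assumptions, not the actual Confidential feed:

```python
import pandas as pd

def build_inventory_summary(inventory: pd.DataFrame) -> pd.DataFrame:
    """Aggregate a hypothetical inventory feed to one row per reseller and location.

    Expected illustrative columns: reseller, location, on_hand_units,
    backlog_units, weekly_sales_units.
    """
    summary = (
        inventory
        .groupby(["reseller", "location"], as_index=False)
        .agg(
            on_hand_units=("on_hand_units", "sum"),
            backlog_units=("backlog_units", "sum"),
            weekly_sales_units=("weekly_sales_units", "sum"),
        )
    )
    # Weeks of supply: how long current on-hand stock lasts at the recent sales rate.
    sales = summary["weekly_sales_units"].where(summary["weekly_sales_units"] > 0)
    summary["weeks_of_supply"] = summary["on_hand_units"] / sales
    return summary
```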

Confidential

Application Developer

Technologies used: Advanced SQL, Teradata, User Data Management tool, Tableau, Informatica

Responsibilities:

  • Confidential is a collection of projects involving businesses from AMR, APAC, and EMEA in the sales and operations area
  • Designed various front-end reports and aggregated semantic layers based on the available data sets for resellers, Confidential quotes, emerging markets, retail back-of-house, and inventory, as well as analyzed production data quality
  • Performed ETL processing using Informatica workflows, loading this data into the EDW (Enterprise Data Warehouse) core for further processing into the EDW semantic layer (a simplified core-to-semantic sketch follows)
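
A minimal sketch of the core-to-semantic step that typically follows such an Informatica load, expressed as a region-partitioned insert-select; schema and table names (edw_core, edw_semantic, sales_fact, sales_by_region_agg) and the qmark (?) parameter style are illustrative assumptions:

```python
# Hypothetical post-ETL step: after the Informatica workflow lands data in the EDW core,
# refresh a regional aggregate in the semantic layer for reporting.
REGIONS = ("AMR", "APAC", "EMEA")

def refresh_regional_aggregate(conn, load_date):
    cur = conn.cursor()
    for region in REGIONS:
        # Rebuild one region at a time so a single failure does not block the others.
        cur.execute(
            "DELETE FROM edw_semantic.sales_by_region_agg WHERE region = ? AND load_date = ?",
            (region, load_date),
        )
        cur.execute(
            """
            INSERT INTO edw_semantic.sales_by_region_agg (region, load_date, product_id, units, revenue)
            SELECT region, CAST(? AS DATE), product_id, SUM(quantity), SUM(net_amount)
            FROM edw_core.sales_fact
            WHERE region = ? AND sales_date = ?
            GROUP BY region, product_id
            """,
            (load_date, region, load_date),
        )
    conn.commit()
```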
