Data Analyst Resume
Philadelphia, Pennsylvania
PROFESSIONAL SUMMARY:
- 7.7 years of experience as a Data Analyst
- Comprehensive knowledge of several business domains, with Telecommunications, Retail, Banking, and Finance being the strongest.
- Solid experience as a Data Analyst in Business Intelligence, Data Warehousing, Reporting (Tableau), and ETL processes
- Extensive knowledge of data warehouse concepts such as the design and implementation of Data Marts with Star and Snowflake schemas, and a thorough understanding of newer data warehouse concepts such as RDM, MDM, ODS, and RDWs
- Proficient across the entire pipeline, from data wrangling, feature selection, and model evaluation through API creation, up to the finished data product
- Worked with a variety of tools including Tableau, Datameer, Spotfire, Informatica, MicroStrategy, and NICE Actimize
- Expertise in building complex SQL queries to perform data mining, data analytics, and data extraction across databases using WinSQL, PuTTY, and DBeaver
- Extensive experience building Tableau dashboards from various databases (MELD, Hive, and Vertica)
- Experience creating different visualizations using bar, line, pie, map, scatter, Gantt, bubble, histogram, bullet, heat map, and highlight table charts.
- Designed and Optimized Connections, Data Extracts, Schedules for Background Tasks and Data Refreshes for corporate Tableau Server.
- Proficient in writing complex SQL queries, stored procedures, functions, triggers, and sub-queries, as well as in normalization, database design, and index creation.
- Proficient in tuning queries used in the application
- Experience in developing, analyzing, and programming machine learning algorithms based on supervised and unsupervised learning
- Professional experience working with different SDLC methodologies such as Agile (Scrum), Waterfall, and Rational Unified Process
- Highly capable of facilitating Joint Application Development & Modeling (JAD, JAM) sessions, Requirements Workshop sessions, and user interviews, and of acting as a liaison between clients, Managers, Consultants, End users, Developers, QA, and all other stakeholders of the project.
- Proficient in preparing Test Scenarios, Test Cases, & Test Artifacts for successful execution of the product.
- Involved in User Acceptance Testing (UAT) and in various testing strategies. Knowledge in software testing, process testing and quality control.
- Competent in providing complete solutions to satisfy all business and technical needs, including writing and running SQL, BI, and other reports, analyzing data, and creating metrics, dashboards, and pivots.
TECHNICAL SKILLS:
Tools: Tableau, Spotfire, Informatica 9.6.0, MicroStrategy, Test Management Tools (QC 9.2, ALM 11), SVN, Shiny with R
Languages: SQL, R, Python, C, ESQL/C, UNIX shell scripting
Databases/Clients: DB2, Informix, NoSQL (MongoDB), MySQL, WinSQL, DBeaver
Data Science: Predictive Analytics, Fraud Detection, Pattern Mining, Sentiment Analysis, Anomaly Detection, Outlier Detection
Machine Learning Algorithms: Classification (Random Forest, K-Nearest Neighbors, Naive Bayes, AdaBoost, SVM); Clustering (K-Means, Hierarchical Clustering); Regression (Linear Regression, Logistic Regression)
Analytics: Python (NLTK for NLP, scikit-learn, NumPy, pandas, SciPy, Plotly), R
Data Engineering: Data Mining (Python, R, SQL)
WORK EXPERIENCE:
Confidential, Philadelphia, Pennsylvania
Data Analyst
Responsibilities:
- Collected requirements from internal stakeholders, sourced the data, loaded it into the Vertica environment via ETL, and built data visualizations on top of it.
- Fetched data from various input sources, performed data mapping, and calculated session length (see the sketch after this list)
- Looked for anomalies in the data using pattern detection
- Calculated utilization and weekly capacity using a Pentaho ETL process
- Created jobs to fetch data for a specific time interval
- Integrated various tables with complex queries to generate peak failover time and peak Mbps during failover
- Created visualizations of the impact of the VOD failover using geo, bar, scatter, bubble, histogram, bullet, heat map, and highlight table charts.
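A minimal sketch of the session-length calculation above, assuming hypothetical device_id / event_type / event_time fields and pandas as the tool; the actual input sources, mapping rules, and column names are not specified in this resume.

```python
import pandas as pd

# Hypothetical session events; the real data came from several input sources.
events = pd.DataFrame({
    "device_id":  ["A", "A", "B", "B"],
    "event_type": ["start", "stop", "start", "stop"],
    "event_time": pd.to_datetime([
        "2017-03-01 08:00", "2017-03-01 08:42",
        "2017-03-01 09:10", "2017-03-01 09:25",
    ]),
})

# Assumes each start is followed by exactly one stop per device.
starts = events[events["event_type"] == "start"].sort_values(["device_id", "event_time"])
stops = events[events["event_type"] == "stop"].sort_values(["device_id", "event_time"])
sessions = pd.DataFrame({
    "device_id": starts["device_id"].values,
    "start": starts["event_time"].values,
    "stop": stops["event_time"].values,
})

# Session length in minutes for each start/stop pair.
sessions["session_minutes"] = (sessions["stop"] - sessions["start"]).dt.total_seconds() / 60
print(sessions)
```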
Confidential
Data Analyst
Responsibilities:
- Collected requirements from internal stakeholders, sourced the data, loaded it into the Vertica environment via ETL, and built data visualizations on top of it.
- Calculated and created demand data for various service flows at the regional and national levels
- Built aggregated views to roll data up to the weekly and monthly levels (see the sketch after this list)
- Developed UNIX shell script automation to populate the dashboard on a weekly basis
- Developed complex queries to perform calculations across tables and compute demand at the service level
- Created visualizations in Tableau after integrating data from views created in the Vertica database.
- Generated area, geo, and bar plots for the Traffic KPI dashboard and published it to the Confidential server with a daily refresh
- Designed Summary and Detailed reports using Data Blending feature in Tableau
- End-to-end experience designing and deploying data visualizations using Tableau
- Generated Dashboards for Traffic data with Quick filters, Parameters and sets to handle views more efficiently.
- Designed and Optimized Connections, Data Extracts, Schedules for Background Tasks and Data Refreshes for corporate Tableau Server
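A minimal sketch of the weekly and monthly roll-up described above, assuming a hypothetical daily demand table with service, date, and demand columns and using pandas; the production views were built in Vertica SQL.

```python
import pandas as pd

# Hypothetical daily demand records; the production aggregated views lived in Vertica.
daily = pd.DataFrame({
    "service": ["VOD", "VOD", "DATA", "DATA"],
    "date": pd.to_datetime(["2017-06-01", "2017-06-08", "2017-06-01", "2017-07-03"]),
    "demand": [120.0, 95.0, 310.0, 280.0],
}).set_index("date")

# Roll daily demand up to weekly and monthly levels per service.
weekly = daily.groupby("service").resample("W")["demand"].sum().reset_index()
monthly = daily.groupby("service").resample("M")["demand"].sum().reset_index()
print(weekly)
print(monthly)
```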
Confidential
Data Analyst
Responsibilities:
- Collected requirements from internal stakeholders, sourced the data, loaded it into the Vertica environment via ETL, and built data visualizations on top of it.
- Fetched data on MACs whose IPs changed from IPv4 to IPv6 (see the sketch after this list)
- Forecast the impact of the IP change and presented the output on Tableau dashboards.
- Built aggregated views to summarize data month over month and year over year
- Created views in Vertica using SQL to optimize query execution.
- Developed UNIX shell script automation to populate the dashboard on a weekly basis
- Developed complex queries to perform calculations across tables and compute the impact on site-level utilization
- Created visualizations in Tableau after integrating data from views created in the Vertica database.
- Generated area, scatter and bar plots for IP Migration dashboard
- Generated Dashboards for Traffic data with Quick filters, Parameters and sets to handle views more efficiently.
- Designed and Optimized Connections, Data Extracts, Schedules for Background Tasks and Data Refreshes for corporate Confidential Tableau Server
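A minimal sketch of how the IPv4-to-IPv6 MAC migration data could be identified, assuming a hypothetical snapshot table with mac and ip columns and using pandas; the production data came from Vertica views.

```python
import pandas as pd

# Hypothetical address snapshots keyed by MAC; the real data came from Vertica views.
snapshots = pd.DataFrame({
    "mac": ["aa:bb:cc:00:00:01", "aa:bb:cc:00:00:01", "aa:bb:cc:00:00:02", "aa:bb:cc:00:00:03"],
    "ip":  ["10.0.0.5", "2001:db8::5", "10.0.0.6", "2001:db8::7"],
})

# Classify each address family by the presence of ':' in the IP string.
snapshots["family"] = snapshots["ip"].map(lambda ip: "IPv6" if ":" in ip else "IPv4")

# MACs observed under both families are the IPv4 -> IPv6 migration candidates.
families_per_mac = snapshots.groupby("mac")["family"].nunique()
migrated = families_per_mac[families_per_mac == 2].index.tolist()
print(migrated)   # ['aa:bb:cc:00:00:01']
```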
Technical Work: Expertise in Tableau, Spotfire, Vertica, KNIME, MELD, Datameer, MySQL, Pentaho ETL, UNIX shell scripting
Confidential
Data Analyst
Responsibilities:
- Conducted data analysis and modeling for inputs into the NICE Actimize Fraud Detection system / transaction monitoring
- Worked with Business, Operations and Technology stakeholders to identify critical data elements, define data quality rules, and implement data quality measurement and monitoring
- Experienced in designing and enhancing the platform for Suspicious Activity Reporting (SAR) and high-risk customer investigations
- Developed FIU (Financial Intelligence Unit) key risk indicators to monitor the effectiveness of AML models and provide insightful recommendations to improve performance
- Configured Actimize Suspicious Activity Monitoring (SAM) for phase-based analytics (scenario-based detection and dynamic anomaly detection)
- Standardized AML processes by leveraging integrated workflow and case management capabilities for effective and automated compliance, using rules and workflows to meet regulatory requirements.
- Responsible for updating policy changes in governance and data management so that rules work as designed.
- Worked with the MDM (Master Data Management) for data integration, data quality, and business process management (BPM).
- Sound knowledge of OFAC, BSA/AML, the USA PATRIOT Act, and risk management principles
- Incorporated key attributes, assessed their impact, and ensured the data was well understood before being processed by the tool.
- Designed queries to perform data mining, data analytics, and data extraction across AML databases using various data mining systems and tools (R, SQL, etc.)
- Performed Root cause analysis of all data quality problems with follow-through to resolution
- Created complex reports and metrics for Compliance, AML, Risk, KYC (Know Your Client), and Anti-Fraud using SQL and Business Intelligence
- As a Data Analyst with knowledge on FinCEN guidance, performed AML investigations, SAR filing, risks and controls, and potential red flags for money laundering and/or terrorist financing.
- Performed quantitative and qualitative analysis of data used to prepare and report various metrics of the bank to internal and external parties using PowerPoint, Excel, and Tableau
Technical Work: Expertise in NICE Actimize, Tableau, AML and KYC processes, MS Project
Confidential, Silicon Valley, California
Data Science Associate
Responsibilities:
- Designed code to connect to different social APIs to pull data and created a model to best fit the output.
- Preprocessed the review text, including removing HTML markup, non-letter characters, and stop words, and performing tokenization
- Used the Natural Language Toolkit (NLTK) for Natural Language Processing (NLP) in Python to tokenize the words
- Extracted the features for the model.
- Developed a Naive Bayes algorithm for text mining and classification on the training and test data sets.
- Naive Bayes performs simple classification of words based on Bayes' theorem; it uses a bag of words (text represented as a collection of words) for subjective analysis of content.
- Performed dictionary generation (counting the occurrences of all words in the whole dataset) and feature set generation (representing every document as a feature vector over the space of dictionary words) to perform classification (see the sketch after this list).
- Performed sentiment analysis on the data fetched after data cleaning, using the bag of words and a probabilistic method for scoring the data.
- Tested accuracy on the test set to validate the model
- Deployed the code in a Shiny app and implemented the functionality to fetch results for non-technical users.
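A minimal sketch of the bag-of-words Naive Bayes pipeline described above, using scikit-learn on a tiny hypothetical review set (scikit-learn's built-in English stop-word list stands in here for the NLTK preprocessing); the original model was trained on review text pulled from social APIs.

```python
import re
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.metrics import accuracy_score

# Tiny hypothetical review set; the real text came from social APIs.
train_reviews = [
    "Great picture quality, loved the service",
    "Terrible buffering, awful experience",
    "Loved it, great value",
    "Awful support and terrible app",
]
train_labels = [1, 0, 1, 0]          # 1 = positive, 0 = negative
test_reviews = ["great service", "terrible app"]
test_labels = [1, 0]

def clean(text):
    # Strip HTML markup and non-letter characters, then lowercase.
    text = re.sub(r"<[^>]+>", " ", text)
    return re.sub(r"[^a-zA-Z]+", " ", text).lower()

# Dictionary generation + feature set generation: every document becomes a
# word-count vector over the vocabulary (bag of words), minus stop words.
vectorizer = CountVectorizer(stop_words="english")
X_train = vectorizer.fit_transform([clean(r) for r in train_reviews])
X_test = vectorizer.transform([clean(r) for r in test_reviews])

# Naive Bayes scores each review probabilistically via Bayes' theorem.
model = MultinomialNB()
model.fit(X_train, train_labels)
print(accuracy_score(test_labels, model.predict(X_test)))
```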
Confidential
Data Engineer/Data Quality Analyst
Tools/Technology: Informatica 9.6.0, DB2, MicroStrategy, and UNIX
Responsibilities:
- Gathered requirements and analyzed the scope of the business specification
- Developed the technical ETL architecture for source data, staging, the dimensional data store schema, and the DBMS
- Developed the ETL strategy and plan, derived from the user requirements and technical architecture, needed to support the ETL process
- Developed the ETL design and rules for moving data into the target data warehouse
- Created transformations of the Aggregator, Sorter, Filter, Router, Rank, Lookup, and Stored Procedure types
- Implemented various types of data transformations: formatting, derived, default, validation, and cleansing
- Loaded data using full/incremental and replace/append approaches
- Implemented Slowly Changing Dimension (SCD) Types 1, 2, and 3 using ETL (see the sketch after this list)
- Loaded various fact tables (fact load, snapshot load, aggregate load)
- Implemented normalization on the fact table
- Designed Star and Snowflake schemas on dimension and fact tables
- Was involved in all three stages of data warehousing
- Analyzed, reported, and tracked defects on a daily basis.
- Created and added tasks and session tasks in an ETL workflow
- Monitored runs using the ETL Workflow Monitor screen
- Debugged ETL session and ETL workflow run properties for errors and fixed them
- Handled errors using validation rules, translation, and data rejection
- Reconciled data in the data warehouse with data in the source using complete reconciliation and stage reconciliation
- Optimized and performance-tuned the data loads.
- Coordinated with the various project teams to ensure full coverage.
- Responsible for reproducing production bugs and validating bug fixes.
- Handled modules independently and took ownership of the associated responsibilities.
- Provided knowledge transfer to team members.
- Logged and tracked defects through QC.
- Rebuilt the existing UNIX scripts to run the ETL Process.
- Compared “standard” reports/queries in the data warehouse to specified source system reports
- Developed data warehouse queries and compared to specified source system reports
- Performed UAT on data warehouse and source system queries/reports; screen prints were filed for bug research
- Experienced in Test Management Tool (QC 9.2, ALM 11)
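A minimal sketch of the Slowly Changing Dimension Type 2 pattern referenced above (expire the current row, insert a new versioned row), expressed in pandas with hypothetical customer dimension columns; the production implementation used Informatica ETL mappings.

```python
import pandas as pd

# Hypothetical current dimension state and an incoming source change.
dim = pd.DataFrame([{
    "customer_id": 101, "city": "Philadelphia",
    "effective_from": "2016-01-01", "effective_to": "9999-12-31", "is_current": True,
}])
incoming = {"customer_id": 101, "city": "Pittsburgh", "load_date": "2017-05-01"}

# SCD Type 2: expire the current row, then insert a new row carrying the change.
mask = (dim["customer_id"] == incoming["customer_id"]) & dim["is_current"]
if not dim.loc[mask].empty and dim.loc[mask, "city"].iloc[0] != incoming["city"]:
    dim.loc[mask, ["effective_to", "is_current"]] = [incoming["load_date"], False]
    new_row = {
        "customer_id": incoming["customer_id"], "city": incoming["city"],
        "effective_from": incoming["load_date"], "effective_to": "9999-12-31",
        "is_current": True,
    }
    dim = pd.concat([dim, pd.DataFrame([new_row])], ignore_index=True)

print(dim)
```

For contrast, Type 1 would simply overwrite the city in place, and Type 3 would keep the prior value in a separate "previous city" column.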
Technical Work: Expertise in Informatica, DB2, Microstrategy and UNIX Shell scripting
Confidential
Programmer Analyst
Responsibilities:
- Supported Wal-Mart grocery Distribution Centers around the globe.
- Analyzed and identified issues in the code and fixed them with 100% code compliance.
- Worked under high-pressure situations with calm and composure to keep DC operations running.
- Learned the retailer's business flow and supply chain in order to support their DCs effectively.
- Suggested solutions for various change requests by translating business needs into technical solutions.
- Communicated effectively with clients all over the world and built a good rapport with them.
- Worked in most phases of the Software Development Life Cycle.
- Created/modified shell scripts to reduce day-to-day effort and the Remedy ticket count.
- Handled one of the major modules in the project, gaining complete insight and more confidence in handling issues
- Using the functional knowledge gained, contributed an effective process flow for supporting the Wal-Mart application
- Fixed defects by following the process of preparing technical documents and test cases, and performed end-to-end functional testing.
- Worked on enhancements and fixes that required deeper code-level changes.
- Attended frequent on-call meetings with clients to clarify issues and facilitate faster resolution.
- Maintained and documented a record of all issues resolved and reported status.
- Supported move to production using SVN tool.
- Trained new associates on the functionality of the project.
- Involved in User Acceptance Testing (UAT) and in various testing strategies. Knowledge in software testing, process testing and quality control.
Technical Work: Expertise in complex SQL queries, UNIX shell scripting, and ESQL/C