- 7+ Years in Data Analysis, System Analysis, Business Requirement Gathering, Data Modeling, and Technical Documentation.
- Good experience in all phases of the Software Development Life Cycle (SDLC) with a good understanding of methodologies like Agile and Waterfall.
- Experience in text analytics, generating data visualizations using R and Python, and creating dashboards using tools like Tableau.
- Working experience with various Hadoop ecosystem components such as HDFS, Spark, Hive, Sqoop, Kafka, and Flume for data storage and analysis.
- Experience in setting up enterprise infrastructure on Amazon Web Services (AWS), including EC2 instances and S3.
- Experience in performing data analysis in IDEs such as PyCharm, Jupyter Notebook, and Spyder.
- Experience in using various packages in R and Python, such as ggplot2, NumPy, Pandas, Matplotlib, and SciPy, for data analysis and model building.
- Good experience in creating visualizations, interactive dashboards, reports, and data stories using Tableau, Power BI.
- Expertise in ETL (Data Extraction, Transformation & Loading), Data modeling, Report designing, and Performance Tuning using QlikView.
- Experienced in writing parameterized queries for generating tabular reports, formatting report layouts, and building subreports using global variables, expressions, functions, sorting, data source definitions, and subtotals in SSRS.
- Experienced in designing and building dimensions and cubes with star and snowflake schemas using SQL Server Analysis Services (SSAS) in SQL Server.
- Good experience working with SSIS for the ETL process, ensuring proper implementation of event handlers, logging, checkpoints, transactions, and package configurations.
- Proficient in Google Analytics and Google Tag Manager for implementing tracking scripts.
- Hands-on experience querying RDBMSs such as MySQL and Oracle, and NoSQL databases like HBase.
- Working knowledge of normalization and denormalization techniques for both OLTP and OLAP systems, creating database objects such as tables, constraints (primary key, foreign key, unique, default), and indexes.
- Well experienced with the Microsoft Office Suite (Word, Excel, MS Access, and PowerPoint).
- Experience with version control tools like Git, and with JIRA and other internal issue trackers for project development.
- Highly creative, innovative, committed, intellectually curious with good communication and interpersonal skills.
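The parameterized-query experience noted above can be sketched with a minimal Python example; sqlite3 (from the standard library) stands in for the MySQL/Oracle databases mentioned, and the table and column names are hypothetical.

```python
import sqlite3

# In-memory SQLite stands in for MySQL/Oracle; table and columns are hypothetical.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO orders (region, amount) VALUES (?, ?)",
    [("East", 120.0), ("West", 75.5), ("East", 40.0)],
)

# Parameterized query: the ? placeholder keeps user input out of the SQL string,
# which is what makes the report both reusable and injection-safe.
region = "East"
rows = conn.execute(
    "SELECT COUNT(*), SUM(amount) FROM orders WHERE region = ?", (region,)
).fetchone()
print(rows)  # (2, 160.0)
```

The same placeholder pattern applies whether the parameter comes from an SSRS report prompt or a script argument.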
Programming languages: SQL, Python, MATLAB, C
Databases: Oracle, MySQL, SQL Server (SSMS), MongoDB, PostgreSQL
Visualization tools: SSRS, Power BI, Tableau
Libraries: NumPy, Pandas, scikit-learn, NLTK, Matplotlib, Seaborn
Integration tools: SSIS, Informatica
Other skills: Hive, Sqoop, Forsk Atoll, Xilinx, Visual Studio, GraphQL, JIRA, Git, MS Office, OLAP, OLTP
Methodologies: Waterfall, Agile (Scrum), SDLC
Packages: NumPy, Pandas, Matplotlib, SciPy
Analysis & Visualization: Tableau, Power BI, SPSS Statistics, QlikView, SSRS
Web Analytics: Google Analytics, Google Tag Manager
ETL/OLAP Tools: SSIS, SSAS
Operating System: Windows, Linux
Confidential, Jersey City, New Jersey
- Implemented and followed an Agile development methodology within the cross-functional team and acted as a liaison between the business user group and the technical team.
- Worked on predictive analytics use cases using Python.
- Integrated Jenkins with Docker containers using the CloudBees Docker Pipeline plugin and provisioned EC2 instances using the Amazon EC2 plugin.
- Cleaned and processed third-party spending data into manageable deliverables in a specific format using Excel macros and Python libraries such as NumPy, Pandas, SciPy, and Matplotlib.
- Performed data analysis on analytics data in Teradata, Hadoop/Hive/Oozie/Sqoop, and AWS using SQL, SQL Assistant, Python, and Apache Spark.
- Migrated Oracle data to the Hadoop cluster and transformed existing SQL scripts to Hive Query.
- Imported Excel sheet data into SPSS for analysis and created a TDE extract for Tableau visualization.
- Validated the application using the QlikView document analyzer.
- Designed high-level ETL architecture for overall data transfer from the OLTP to OLAP with the help of SSIS.
- Used SSIS to create ETL packages (dtsx files) to validate, extract, transform and load data to the data warehouse, data mart databases, and process SSAS cubes to store data to OLAP databases.
- Used IDEs such as Jupyter Notebook and Spyder for data analysis.
- Developed parameterized reports in SSRS with SQL queries and stored procedures for easier user access.
- Produced operational reports in SSRS, i.e., drill-down, drill-through, dashboard, and matrix reports.
- Utilized Google Analytics and Google Tag Manager and implemented new tracking scripts.
- Analyzed the Hadoop cluster using Big Data analytics tools including Pig, Hive, HBase, and Sqoop.
- Handled importing of data from various data sources, performed transformations using Hive and MapReduce, loaded data into HDFS, and extracted data from Oracle into HDFS using Sqoop.
- Created SQL scripts for testing and validating data on various reports and dashboards and tracked performance issues effectively using JIRA.
- Maintained and coordinated environment configuration, controls, code integrity, and code conflict resolution and used Git for branching, tagging, and merging.
Environment: Agile, Python, Jupyter notebook, Spyder, EC2, NumPy, Pandas, SciPy, Matplotlib, AWS, SQL, Spark, Oracle, Hadoop, Hive, MS Excel, QlikView, OLTP, OLAP, SSIS, SSAS, SSRS, HBase, Sqoop, HDFS, JIRA, Git.
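The spending-data cleanup described above was done with Excel macros and Pandas/NumPy; a minimal standard-library sketch of the same idea (coerce amounts, drop unparseable rows, aggregate per vendor) is below. The vendor names and field layout are invented for illustration.

```python
import csv
import io

# Raw third-party feed with inconsistent formatting; records are hypothetical.
raw = io.StringIO(
    "vendor,amount\n"
    "Acme, 1200.50 \n"
    "Globex,n/a\n"
    "Acme,300\n"
)

def clean_spend(fh):
    """Coerce amounts to float, skip unparseable rows, total spend per vendor."""
    totals = {}
    for row in csv.DictReader(fh):
        try:
            amount = float(row["amount"].strip())
        except ValueError:
            continue  # skip rows like "n/a" that cannot be parsed
        totals[row["vendor"]] = totals.get(row["vendor"], 0.0) + amount
    return totals

totals = clean_spend(raw)
print(totals)  # {'Acme': 1500.5}
```

With Pandas the equivalent would be `pd.to_numeric(..., errors="coerce")` followed by `dropna()` and `groupby().sum()`.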
- Built datasets using normalization techniques as part of data preparation for machine learning algorithms.
- Created datasets for internal business dashboards and automated scripts to implement a sales forecasting model for the external business.
- Built SSRS and Tableau reports.
- Performed analysis to support the decision-making process.
- Developed user acceptance criteria, test plans, and test cases; conducted functional/user acceptance testing (UAT) and was accountable for user sign-off on changes.
- Applied knowledge of data, data structures, and analytical tools to research and identify trends in physician data.
- Participated in peer reviews of software development life cycle deliverables (e.g., SRS) for the enterprise extracts work and provided estimates for automation-testing work.
- Performed API testing using SoapUI, REST APIs, and GraphQL.
- Developed a strong understanding of the business unit's function and effectively communicated technical issues and solutions to the business unit in non-technical terms.
- Developed the logical data model from the conceptual model.
- Interacted with the report server, reports, and report-related items through application pages on the SharePoint site; used SharePoint document libraries to store report-related content types.
- Responsible for data-mapping activities from source systems.
- Performed data profiling on new sources before creating new subject areas in the warehouse.
- Performed multiple testing techniques to thoroughly check out code and verify system connectivity across multiple platforms.
- Conducted Data Modeling review sessions for different user groups, participated in sessions to identify requirement feasibility.
- Eliminated errors in Erwin models through the implementation of Model Mart (a companion tool to Erwin that controls the versioning of models).
- Identified the end-to-end test data flow architecture, prepared integration test scenarios, coordinated with all system/application stakeholders, created test data, executed test scenarios, triaged defects and high-priority issues with responsible team members, and reported results.
- Verified that the correct authoritative sources were being used and that the extract, transform and load (ETL) routines would not compromise the integrity of the source data.
- Developed Data Migration and Cleansing rules for the Integration Architecture (OLTP, ODS, DW).
- Created data visualizations to effectively convey findings from the KPI dashboard and scheduled reports for different dealerships using Tableau.
- Performed data analysis on all results, and prepared and published reports online for customers.
- Created a pipeline process to load data from S3 storage into a MySQL database.
- Performed data transformation and data cleansing on the data.
- Performed data exploration to identify insights in the data and prepared presentations for clients.
- Integrated Tableau dashboard APIs with the client's front-facing website.
- Prepared test plans and test data and executed test cases to ensure application functionality met user requirements.
- Utilized the Rational Unified Process (RUP) to create use case, activity, and class diagrams and workflow process diagrams.
- Extensive development of data-integration solutions using VB 5.0.
- Responsible for providing database solutions using MS Access, SQL Server, and Oracle databases.
- Proficient in T-SQL and PL/SQL for creating tables, views, triggers, and stored procedures.
- Created reports using Crystal Reports and Oracle Forms/Reports.
- Provided technical and functional support to end users.
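The dataset-normalization step mentioned at the top of this section can be sketched as a simple min-max scaler; the feature values below are made up for illustration.

```python
def min_max_normalize(values):
    """Scale a numeric feature into [0, 1]; a constant column maps to all zeros."""
    lo, hi = min(values), max(values)
    if hi == lo:
        return [0.0 for _ in values]  # avoid division by zero on constant features
    return [(v - lo) / (hi - lo) for v in values]

feature = [10.0, 20.0, 15.0, 30.0]
scaled = min_max_normalize(feature)
print(scaled)  # [0.0, 0.5, 0.25, 1.0]
```

In practice the same transform is available as `MinMaxScaler` in scikit-learn, which also remembers the fitted min/max for applying to new data.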
Confidential - Mahwah, NJ
- Used the Waterfall methodology throughout the project; involved in weekly and daily release management.
- Worked with CSV files while getting input from the MySQL database.
- Generated various graphical capacity-planning reports using Python packages like NumPy and Matplotlib.
- Implemented data exploration to analyze patterns and select features using Python's SciPy.
- Validated the system through end-to-end testing against the approved functional requirements.
- Extracted data tables using custom queries from Excel files and Access/SQL databases, and provided the extracted data as input to Tableau.
- Responsible for building a growth model for the organization by segmenting advisors using quadrant analysis in Tableau.
- Involved in SQL Optimizations, Performance Analysis, and future growth analysis for OLTP and data warehouse applications.
- Performed reverse engineering for a wide variety of relational DBMS, including Microsoft Access and Oracle.
- Planned monthly and quarterly business-monitoring reports by creating Excel pivot summary reports, including system calendars.
- Created user stories and maintained and tracked them using JIRA.
- Managed code versioning with Git and deployments to staging and production servers.
- Worked with the Cognos development and data warehouse teams on the implementation of various Cognos projects.
- Performed system administration for Cognos Planning, Cognos BI, and Business Objects applications.
- Responsible for setting up Cognos security, including user groups and roles.
- Supported efficient quality assurance and testing of various system applications.
- Created complex dashboards for the business requirements and end users.
- Efficiently served as a liaison between the business and IT.
- Determined and analyzed the problems in service, performance and provided solutions.
- Load-balanced the Cognos 8.4 BI multi-server environment to address performance issues.
- Built models containing query subjects, query items, and namespaces from imported metadata.
- Created ad-hoc reports using Query Studio and provided Cognos user support and troubleshooting.
- Checked report performance issues to ensure the best sources were used, reducing production expenses.
- Worked on the data warehouse using Cognos Data Manager 8.4.
- Created and developed new reports in Report Studio upon requests from business users.
- Developed Standard Reports, List Reports, Crosstab Reports, Charts, Drill-through Reports, and Master-Detail Reports using Report Studio.
- Created prompts, query calculations, conditions, formatting, conditional formatting, filters, multilingual reports Using Report Studio.
- Created Query Prompts, Developing Prompt pages and Conditional Variables.
- Created Bursting Reports and Multilingual Reports Using Report Studio and Query Studio.
- Worked with multi-query reports and charts.
- Created models based on the dimensions, levels, and measures required for Analysis Studio, using Transformer.
- Performance-tuned Cognos cubes, decreasing build times, and published them to the enterprise server; used Framework Manager to build models and packages and publish packages to Cognos Connection.
- Used Cognos Connection to administer and schedule reports to run at various intervals.
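The advisor quadrant analysis mentioned earlier in this section was built in Tableau; the underlying segmentation logic can be sketched in Python as splitting on the median of each metric. The advisor names and metrics below are invented for illustration.

```python
from statistics import median

# Hypothetical advisor metrics: (name, revenue, client_growth).
advisors = [
    ("A", 120, 0.08),
    ("B", 90, 0.15),
    ("C", 200, 0.02),
    ("D", 60, 0.01),
]

def quadrant_segments(records):
    """Place each advisor in one of four quadrants around the median of each metric."""
    rev_med = median(r[1] for r in records)
    growth_med = median(r[2] for r in records)
    segments = {}
    for name, rev, growth in records:
        segments[name] = (
            ("high" if rev >= rev_med else "low") + "-revenue/"
            + ("high" if growth >= growth_med else "low") + "-growth"
        )
    return segments

segs = quadrant_segments(advisors)
print(segs["C"])  # high-revenue/low-growth
```

Using medians rather than means keeps the split balanced when one advisor's revenue is an outlier, which mirrors how quadrant reference lines are typically set in Tableau.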