
Data Analyst/business Analyst Resume

Fremont, CA

PROFESSIONAL SUMMARY:

  • 7+ years of professional experience as a Data Analyst with a strong background in understanding, analyzing, and documenting Business Requirements, Functional Specifications, Business Process Flows, and Business Process Mapping, as well as data extraction, manipulation, visualization, and validation techniques and Data Modeling.
  • Experience in all phases of the Software Development Life Cycle (SDLC) under both Waterfall and Agile methodologies; performed a range of system-related analytical activities within each phase.
  • Skilled at gathering and documenting Business Requirements and writing Use Cases, with a solid grasp of SDLC workflow concepts.
  • Expertise in phases ranging from understanding User Requirements through Analysis/Design, Testing, Project Management, and Product Development, with end-to-end product delivery across the SDLC and STLC phases.
  • Experienced in requirements gathering and analysis using various elicitation techniques like interviews, surveys, JAD sessions, observation, prototyping and brainstorming.
  • Experienced in various diagramming techniques, including wireframes, process maps, flowcharts, functional demonstrations, context diagrams, and BPM modeling techniques.
  • Worked with the AWS Cloud platform and its features, including EC2, VPC, RDS, EBS, S3, CloudWatch, CloudTrail, CloudFormation, and Auto Scaling.
  • Strong expertise in tracing requirements throughout the development process and verifying adherence to the Requirements Traceability Matrix (RTM).
  • Expertise in performing GAP analysis, maintaining the Requirements Traceability Matrix (RTM), data mapping, and Data Modeling.
  • Good working knowledge of and experience with Microsoft Word, Excel, Visio, PowerPoint, and HP Quality Center.
  • Experience in conducting GAP analysis, SWOT analysis, cost-benefit analysis, and ROI analysis.
  • Used the AWS command line interface and Management Console to interact with AWS resources and APIs.
  • Hands-on experience with MS Visio, creating data flow/process flow diagrams such as flowcharts, activity charts, sequence diagrams, and use case diagrams per UML.
  • Knowledge of Data Warehousing concepts and Extract, Transform and Load (ETL) processes.
  • Experience in Data Modeling, Data Warehousing, schemas, Data Marts, and Extract/Transform/Load (ETL).
  • Key strengths include analysis using BI tools, COSMOS with SCOPE scripts, reporting using Power BI, ETL using SSIS, requirements gathering, RDBMS concepts, SQL queries, and preparation of design documents.
  • Experience in data analysis and data visualization using Tableau and Microsoft Power BI; model building with machine learning algorithms for prediction and forecasting on historical and time-series data using regression techniques; statistical/econometric tools such as SAS and R; and data mining using Python, SQL, Hadoop, Spark, Hive, etc.
  • Reviewed basic SQL queries and edited inner, left, and right joins in Tableau Desktop by connecting to both live/dynamic and static datasets (a join sketch follows this list).
  • Strong skills in SQL, data warehousing, data exploration, data extraction, data validation, reporting, and Excel.
  • Used tools like Tableau and Microsoft Excel for data analysis and generating data reports.
  • Developed visualizations using Tableau for better understanding of data; performed data cleaning, normalization, and data transformation.
  • Extensively used SQL for accessing and manipulating database systems.
  • Adept in RDBMSs such as Oracle, MS SQL Server, and MS Access; also skilled at writing SQL queries and Stored Procedures.
  • User Acceptance Testing (UAT) and Manual testing (Functionality Testing) of UI and Web Applications. Extensively worked in creating Test Procedures, Test plans, Test cases and reviewing them for quality assurance.
  • Involved in System Integration Testing (SIT), Regression Testing, GUI Testing, Performance Testing & User Acceptance Testing (UAT).
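
The join review mentioned above is routine in this kind of work; below is a minimal, runnable sketch of the three join types using Python's sqlite3 in place of a Tableau live connection. The orders/customers tables and all column names are invented for illustration.

```python
# A hypothetical sketch of reviewing inner/left/right joins, with an
# in-memory SQLite database standing in for a Tableau data connection.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
    CREATE TABLE orders (order_id INTEGER, customer_id INTEGER, amount REAL);
    CREATE TABLE customers (customer_id INTEGER, region TEXT);
    INSERT INTO orders VALUES (1, 10, 250.0), (2, 11, 80.0), (3, 99, 40.0);
    INSERT INTO customers VALUES (10, 'West'), (11, 'East'), (12, 'South');
""")

# Inner join: only rows with a match on both sides.
cur.execute("""
    SELECT o.order_id, c.region, o.amount
    FROM orders o
    INNER JOIN customers c ON o.customer_id = c.customer_id
""")
print(cur.fetchall())

# Left join: keep every order, even when no customer matches (region is NULL).
cur.execute("""
    SELECT o.order_id, c.region, o.amount
    FROM orders o
    LEFT JOIN customers c ON o.customer_id = c.customer_id
""")
print(cur.fetchall())

# A right join (keep every customer) can be expressed by swapping the table
# order of a left join, which is how older SQLite versions handle it.
cur.execute("""
    SELECT o.order_id, c.region
    FROM customers c
    LEFT JOIN orders o ON o.customer_id = c.customer_id
""")
print(cur.fetchall())
conn.close()
```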

TECHNICAL SKILLS:

Programming Languages: C, C++, Visual Basic 6.0, SQL, Hadoop (Hive, Pig), Python, R.

Scripting Languages: MS-DOS, Bash, Korn.

ETL tools: Confidential PowerCenter, SSIS, Ab Initio.

Data modeling: Sybase PowerDesigner, IBM Data Architect.

Frameworks: Struts, Spring, Hibernate, Spring MVC, Spring Web Flow, Spring IoC, Spring AOP, Groovy.

Application/Web Servers: JBoss, GlassFish 2.1, WebLogic, WebSphere, Apache Tomcat Server.

MS Office Package: Microsoft Office (Word, Excel, PowerPoint, Visio, Project) on Windows.

Visualization tools: Tableau Desktop, Python, Pandas, NumPy, Datorama.

ETL tools: Confidential, SSIS, SSAS, SSRS; Tracking tool: JIRA.

Database Development: T-SQL and PL/SQL, Microsoft Hyper-V Servers.

Databases: Teradata (R12, R13, R14.10), MS SQL Server, DB2, Netezza.

Testing and defect tracking tools: HP/Mercury (Quality Center, WinRunner, QuickTest Professional, Performance Center, Requisite), MS Visio, Visual SourceSafe.

Operating Systems: Windows, UNIX, Sun Solaris.

PROFESSIONAL EXPERIENCE:

Confidential, Fremont, CA

Data Analyst/ Business Analyst

Responsibilities:

  • Analyzed the requirements and segregated them into high-level and low-level Use Cases and activity diagrams using Rational Rose, following UML methodology, thereby defining the Data Process Models.
  • Was responsible for indexing the tables in the data warehouse; used senior-level SQL query skills (Oracle and T-SQL) in analyzing and validating the SSIS ETL data warehouse processes (see the first sketch after this list).
  • Converted Business Requirements into Functional Specifications and conducted JAD sessions to develop an architectural solution that ensures the application meets the business requirements, resolves open issues, and handles change requests.
  • Analyzed the data warehouse project database requirements from the users in terms of the dimensions they want to measure and the facts against which those dimensions need to be analyzed.
  • Responsible for physical/logical data modeling; metadata documentation; user documentation; and production specs
  • Identify business rules for data migration and perform data administration through data models and metadata
  • Used Toad for Data Analysts to connect to Oracle and DB2 for data analysis.
  • Created and maintained requirements, test cases, and defects in Mercury Quality Center.
  • Developed and auto-deployed content using AWS (Amazon Web Services), Git/Bitbucket, Maven, and Jenkins.
  • Developed custom tools/scripts/packaging solutions for AEM using Java/Unix.
  • Developed integration solutions between AEM, AWS (Lambda, S3, API Gateway, and CloudFormation), and Spredfast (social) platforms.
  • Developed detailed ERDs and DFDs using various modeling tools and developed databases based on the system model following the techniques in SDLC (software development life cycle)
  • Gathered, Analyzed and Translated systems requirements into technical specifications utilizing UML and RUP methodology. Responsible for creating different sessions and workflows to load the data to Data Warehouse using Confidential Workflow Manager
  • Worked on AWS Elastic Beanstalk to deploy, monitor, and scale an application
  • Created new EC2 instances in AWS, allocated volumes, and provisioned access using IAM (see the second sketch after this list).
  • Analyzed the existing reports and reporting system; worked on exporting data using Teradata.
  • Utilized Agile/Scrum and PMI methodologies to monitor, steer, and develop project objectives.
  • Implemented the transformation component of DataStage to integrate the data and implement the business logic. Created Test Cases and scenarios for Unit, Regression, and Data Integration testing as well as Back-end and System testing.
  • Strong interpersonal and communication skills within all levels of the organization, and familiarity with regulatory mandates and internal controls.
  • Proposed solutions for reporting needs and developed prototypes using SQL and Business Objects to address those needs. Developed a data conversion strategy for data migration from legacy systems to Technology product modules.
  • Familiarity with reporting and Business Intelligence tools such as Crystal Reports, as well as Toad data tools.
  • Developed and maintained sales reporting using MS Excel queries, SQL in Teradata, and MS Access.
  • Extensive experience in testing and implementing Extraction, Transformation, and Loading of data from multiple sources into the Data Warehouse using Confidential.
  • Used crontab in UNIX, Scheduler in Windows, and Jenkins for scheduling.
  • Responsible for designing, developing, and testing the software (Confidential, PL/SQL, UNIX shell scripts) to maintain the data marts (load data, analyze using OLAP tools). Authored Functional Requirements Documents (FRDs) by interacting with the development team to improve the client's legacy system.
  • Extensive Data Warehousing experience using Confidential as an ETL tool on various databases, including Oracle, SQL Server, Teradata, and MS Access.
  • Experience working with Microsoft Outlook to set up meetings, manage calendars, etc.
  • Experience working with Peer for maintaining weekly status and planning.
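
First sketch: a hedged illustration of the SQL-based ETL validation mentioned above, reconciling row counts and a column checksum between a source and a warehouse target. The fact_sales table is hypothetical, and in-memory SQLite connections stand in for the Oracle/SQL Server connections an SSIS validation would actually use (any DB-API 2.0 connection could be passed instead).

```python
# A minimal source-vs-target reconciliation sketch, under the stated
# assumptions; table and column names are invented for illustration.
import sqlite3

def reconcile(src_conn, tgt_conn, table, amount_col):
    """Compare row counts and a column checksum between source and target."""
    checks = {}
    for name, conn in (("source", src_conn), ("target", tgt_conn)):
        cur = conn.execute(
            f"SELECT COUNT(*), COALESCE(SUM({amount_col}), 0) FROM {table}"
        )
        checks[name] = cur.fetchone()
    ok = checks["source"] == checks["target"]
    print(f"{table}: source={checks['source']} target={checks['target']} "
          f"{'OK' if ok else 'MISMATCH'}")
    return ok

# Stand-in databases so the sketch runs end to end.
src = sqlite3.connect(":memory:")
tgt = sqlite3.connect(":memory:")
for conn, rows in ((src, [(1, 100.0), (2, 50.0)]),
                   (tgt, [(1, 100.0), (2, 50.0)])):
    conn.execute("CREATE TABLE fact_sales (id INTEGER, amount REAL)")
    conn.executemany("INSERT INTO fact_sales VALUES (?, ?)", rows)

reconcile(src, tgt, "fact_sales", "amount")
```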
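
Second sketch: a minimal boto3 version of the EC2 provisioning step described above. The AMI ID, instance profile name, region, and volume size are placeholders, not values from the engagement.

```python
# A hedged sketch of launching an EC2 instance with an IAM instance profile
# and attaching an extra EBS volume; all identifiers below are hypothetical.
import boto3

ec2 = boto3.resource("ec2", region_name="us-west-1")

# Launch an instance with an IAM instance profile attached, so the box
# gets role-based permissions instead of embedded keys.
instances = ec2.create_instances(
    ImageId="ami-0123456789abcdef0",                   # placeholder AMI
    InstanceType="t2.micro",
    MinCount=1,
    MaxCount=1,
    IamInstanceProfile={"Name": "data-analyst-role"},  # hypothetical profile
)
instances[0].wait_until_running()

# Allocate an extra EBS volume for working data and attach it.
client = boto3.client("ec2", region_name="us-west-1")
volume = client.create_volume(AvailabilityZone="us-west-1a", Size=100)
client.get_waiter("volume_available").wait(VolumeIds=[volume["VolumeId"]])
client.attach_volume(
    VolumeId=volume["VolumeId"],
    InstanceId=instances[0].id,
    Device="/dev/sdf",
)
```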

Environment: Microsoft Outlook, MS Access, MS Excel, MS Word, MS Project, MS Visio, TOAD, JAD, Teradata, Test Track, Peer, Mainframe, OLAP, ERwin, ER Studio, ClearQuest, ClearCase, Java, DB2.

Confidential, Murray Hill, NJ

Data Analyst/ Business Analyst

Responsibilities:

  • Analyzed the User Requirement Document, Business Requirement Document (BRD), Technical Requirement Specification, and Functional Requirement Specification (FRS). Responsible for the development of a Data Warehouse for personal lines property and casualty insurance.
  • Worked on data modeling and produced data mapping and data definition documentation.
  • Created workflow scenarios, designed new process flows, and documented the Business Process, various Business Scenarios, and activities of the Business from the conceptual to the procedural level. Provided customer service and support for Property & Casualty risk management needs.
  • Used the guidelines and artifacts of the Rational Unified Process (RUP) to strategize the Implementation of RUP effort in different iterations and phases of the SDLC
  • Developed a file of ACORD Forms used as the standards in all Property and Casualty markets, for both Personal and Commercial Lines of Business
  • Used Amazon IAM to grant fine-grained access to AWS resources to users; also managed users' roles and permissions for the AWS account through IAM (see the IAM sketch after this list).
  • Responsible for performing back-end analysis and testing using ETL tool and Quest Toad
  • Designed and Administered Teradata Scripts, Tables, Indices and Database Objects
  • Created context and workflow models, information and business rule models, and use case and object models during analysis using Rational tools. Performed data mapping and logical data modeling, created class diagrams and ER diagrams, and used SQL queries to filter data.
  • Set up Jenkins jobs for the continuous integration process and to execute test cases.
  • Helped to develop Data Migration and Cleansing rules for the Integration Architecture (OLTP)
  • Developed schemas for extraction, transformation, and loading (ETL) using Solonde Warehouse Workbench to expedite data integration between systems.
  • Created physical and logical models and used Erwin for Dimensional Data Modeling.
  • Created ER (Entity Relationship) Diagrams, function relationship diagrams, data flow diagrams and enforced all referential integrity constraints using ER Studio
  • Loaded staging tables on Teradata and then loaded target tables on SQL Server via DTS transformation packages (see the staging-load sketch after this list).
  • Identify business rules for data migration and perform data administration through data models and metadata
  • Provided broad enterprise business system deployment analysis and support of Oracle Database
  • Followed a structured approach to organize requirements into logical groupings of essential Business Processes, Business Rules, and Information needs, ensuring that no Critical Requirements are missed.
  • Extensively used Visual Basic to create macros to make data migration testing more useful and compared results from the legacy databases to the new CRM database
  • Followed UML-based methods using Microsoft Visio to create Use Case Diagrams, Activity Diagrams, State Chart Diagrams, Sequence Diagrams, and Collaboration Diagrams.
  • Prepared Logical Data Models containing Entity Relationship Diagrams, Data Flow Diagrams, and supporting documents describing the relationships between the data elements, in order to analyze and document the Business Data Requirements.
  • Prepared system Test Plan and scenarios for Test Scripts. Responsible for providing analytical support in the Design, Development and Implementation of Project
  • Verified the Business Scenarios on new builds to allow extended testing by the QA team
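
A minimal sketch of the fine-grained IAM access described above, using boto3. The user name, policy name, and S3 bucket are hypothetical; the point is scoping a policy to a single resource rather than the whole account.

```python
# A hedged sketch of creating and attaching a narrowly scoped IAM policy;
# every identifier below is a placeholder.
import json
import boto3

iam = boto3.client("iam")

policy_document = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Action": ["s3:GetObject", "s3:ListBucket"],
        "Resource": [
            "arn:aws:s3:::claims-data-bucket",            # placeholder bucket
            "arn:aws:s3:::claims-data-bucket/reports/*",
        ],
    }],
}

policy = iam.create_policy(
    PolicyName="claims-report-readonly",                  # hypothetical name
    PolicyDocument=json.dumps(policy_document),
)
iam.attach_user_policy(
    UserName="report_analyst",                            # hypothetical user
    PolicyArn=policy["Policy"]["Arn"],
)
```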
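
The staging-to-target load above was done with DTS packages, which are configured inside SQL Server; the pandas sketch below only illustrates the same land-validate-load pattern. Table names and the in-memory SQLite stand-in are assumptions.

```python
# A pandas-based analogue of a staging-table load: land raw rows, validate
# in the staging layer, then load the target. All names are hypothetical.
import sqlite3
import pandas as pd

conn = sqlite3.connect(":memory:")

# 1. Land the raw extract in a staging table, types as-is.
raw = pd.DataFrame({"policy_id": [1, 2, 2], "premium": [500.0, 300.0, 300.0]})
raw.to_sql("stg_policies", conn, index=False)

# 2. Validate/clean in the staging layer (here: drop duplicate rows).
clean = pd.read_sql("SELECT DISTINCT policy_id, premium FROM stg_policies", conn)

# 3. Load the validated rows into the target table.
clean.to_sql("dim_policies", conn, index=False)
print(pd.read_sql("SELECT * FROM dim_policies", conn))
```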

Environment: Rational Enterprise Suite (Rose, ClearCase, ClearQuest), RUP, LoadRunner, Visual Basic, SQL, SQL Server, Toad, Oracle, Mainframe, WebLogic, ER/Studio, HTML, WinRunner, ERwin, Project Management.

Confidential, Jacksonville, FL

Data Analyst

Responsibilities:

  • Worked with the BI team in gathering report requirements and used Sqoop to import data into HDFS and Hive.
  • Involved in the following phases of analytics using R, Python, and Jupyter Notebook: (a) data collection and treatment: analyzed existing internal and external data, worked on entry errors and classification errors, and defined criteria for missing values; (b) data mining: used cluster analysis to identify customer segments, decision trees to separate profitable from non-profitable customers, and Market Basket Analysis to study customer purchasing behavior and part/product association.
  • Developed multiple MapReduce jobs in Java for data cleaning and pre-processing (a Python streaming analogue is sketched after this list).
  • Assisted with data capacity planning and node forecasting
  • Installed, Configured and managed Flume Infrastructure
  • Acted as administrator for Pig, Hive, and HBase, installing updates, patches, and upgrades.
  • Worked closely with the claims processing team to obtain patterns in filing of fraudulent claims
  • Worked on performing major upgrade of cluster from CDH3u6 to CDH4.4.0
  • Developed MapReduce programs to extract and transform the data sets; results were exported back to RDBMS using Sqoop.
  • Have good exposure to GIT, Jenkins, JIRA
  • Observed patterns in fraudulent claims using text mining in R and Hive.
  • Exported the required information to an RDBMS using Sqoop, making the data available to the claims processing team to assist in processing claims.
  • Developed MapReduce programs to parse the raw data, populate staging tables, and store the refined data in partitioned tables in the EDW.
  • Adept in statistical programming languages like R and Python, including Big Data technologies like Hadoop and Hive.
  • Experience working as Data Engineer, Big Data Spark Developer, Front End Developer and Research Assistant
  • Created tables in Hive and loaded the structured data resulting from MapReduce jobs.
  • Developed many queries using HiveQL and extracted the required information.
  • Created Hive queries that helped market analysts spot emerging trends by comparing fresh data with EDW reference tables and historical metrics (see the Hive sketch after this list).
  • Was responsible for importing the data (mostly log files) from various sources into HDFS using Flume
  • Enabled speedy reviews and first-mover advantages by using Oozie to automate data loading into the Hadoop Distributed File System and Pig to pre-process the data.
  • Provided design recommendations and thought leadership to sponsors/stakeholders that improved review processes and resolved technical problems
  • Managed and reviewed Hadoop log files
  • Tested raw data and executed performance scripts
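
The cleaning jobs above were written in Java; the sketch below is a Python analogue in the Hadoop Streaming style, showing the same shape of work: a mapper that drops malformed claim rows and a reducer that sums amounts per claim. The three-field record layout and all paths are invented.

```python
# mapper_reducer.py -- in practice these would be two separate scripts
# submitted with Hadoop Streaming, e.g. (paths are assumptions):
#   hadoop jar hadoop-streaming.jar -input /raw/claims -output /clean/claims \
#       -mapper "mapper_reducer.py map" -reducer "mapper_reducer.py reduce"
import sys

def mapper(stream=sys.stdin):
    """Drop malformed rows and emit tab-separated (claim_id, amount)."""
    for line in stream:
        fields = line.rstrip("\n").split(",")
        if len(fields) != 3:      # expect claim_id, claimant, amount
            continue
        claim_id, _, amount = fields
        try:
            print(f"{claim_id}\t{float(amount)}")
        except ValueError:        # drop rows with non-numeric amounts
            continue

def reducer(stream=sys.stdin):
    """Sum amounts per claim_id; Streaming delivers input sorted by key."""
    current, total = None, 0.0
    for line in stream:
        key, value = line.rstrip("\n").split("\t")
        if key != current:
            if current is not None:
                print(f"{current}\t{total}")
            current, total = key, 0.0
        total += float(value)
    if current is not None:
        print(f"{current}\t{total}")

if __name__ == "__main__":
    mapper() if sys.argv[1:] == ["map"] else reducer()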
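
A hedged sketch of the trend-spotting Hive queries described above, submitted through PyHive. The host, database, and both table names are placeholders; the query flags products whose fresh weekly claims run well ahead of the EDW reference average.

```python
# A minimal PyHive sketch, under the stated assumptions; all identifiers
# are hypothetical.
from pyhive import hive

conn = hive.Connection(host="hive-gateway.example.com", port=10000,
                       database="claims")
cursor = conn.cursor()
cursor.execute("""
    SELECT f.product_id,
           f.weekly_claims,
           r.avg_weekly_claims,
           f.weekly_claims / r.avg_weekly_claims AS lift
    FROM   fresh_claims_summary f
    JOIN   edw_reference_metrics r
      ON   f.product_id = r.product_id
    WHERE  f.weekly_claims > 2 * r.avg_weekly_claims   -- emerging spike
""")
for row in cursor.fetchall():
    print(row)
```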

Environment: HDFS, Pig, Hive, MapReduce, Linux, HBase, Flume, Sqoop, R, VMware, Eclipse, Cloudera, Python.

Confidential, Baton Rouge, LA

Data Analyst

Responsibilities:

  • Interacted with users and business analysts to gather information and requirements
  • Involved in defining source-to-target data requirements, overseeing and signing off on data mapping specifications for the data warehouse
  • Data validation required understanding and applying various business coding rules to compare data on the existing system with the new business warehouse.
  • Extensively used inner joins and outer joins when creating SQL queries from multiple tables.
  • Worked on data verification and validation to evaluate whether the data generated according to the requirements was appropriate and consistent.
  • Wrote SQL targeting performance and recommended defining indexes on the tables as needed.
  • Developed processes on both Teradata and Oracle using shell scripting
  • Performed in-depth analysis of data and prepared weekly, biweekly, and monthly reports using MS Excel, MS Access, SQL, and UNIX.
  • Wrote Base SAS Programs for loading data from flat files to Teradata tables
  • Imported spreadsheet data into SAS using the PROC IMPORT procedure, then generated the reports.
  • Used the PROC EXPORT procedure to export data from SAS to spreadsheets.
  • Applied simple statistical procedures such as PROC MEANS and PROC FREQ for analyzing data (a pandas analogue is sketched after this list).
  • Developed SAS programs for converting large volumes of text files into Teradata tables by importing the text files from Mainframes to the desktop.
  • Extracted data from existing data source and performed ad-hoc queries by using SQL and UNIX
  • Generated reports for various departments (Telemarketing, Mailing, New Accounts) using SQL, BTEQ, and UNIX.
  • Worked in generating graphs using MS Excel Pivot tables
  • Performed in-depth quantitative and data analysis.
  • Ran periodic reports, such as partner and credit reports, for project managers.
  • Analyzed and specified data requirements.
  • Proficient in importing/exporting large amounts of data from files to Teradata and vice versa
  • Worked on VBA macros to create weekly reports.
  • Used OLAP functions such as SUM, COUNT, and CSUM (see the window-function sketch after this list).
  • Proficient in working with SET, MULTISET, derived, and volatile temporary tables.
  • Proficient in loading data into staging tables via views.
  • Designed and developed weekly, monthly reports related to the marketing and financial departments using Teradata SQL
  • Experience preparing PowerPoint presentations (financial data, charts).
  • Supported the Testing teams with data access and testing needs
  • Documented the processes at high level and in detailed manner
  • Participated in Daily and Weekly Team Meetings
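
The summary statistics above were produced with SAS PROC MEANS and PROC FREQ; the following is a pandas analogue for readers without SAS, using invented columns and data.

```python
# A minimal pandas sketch of PROC MEANS / PROC FREQ style summaries.
import pandas as pd

df = pd.DataFrame({
    "department": ["TeleMarketing", "Mailing", "TeleMarketing", "NewAccounts"],
    "balance":    [1200.0, 340.0, 980.0, 150.0],
})

# PROC MEANS equivalent: n, mean, std, min, max of a numeric variable.
print(df["balance"].describe())

# PROC FREQ equivalent: frequency counts and percentages of a class variable.
counts = df["department"].value_counts()
print(pd.DataFrame({"count": counts, "pct": 100 * counts / len(df)}))
```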
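
A runnable sketch of the cumulative-sum OLAP style mentioned above. Teradata's CSUM form appears in a comment; the ANSI window-function equivalent below runs on SQLite 3.25+ with invented data.

```python
# Running total via a window function, standing in for Teradata's CSUM.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE txns (txn_date TEXT, amount REAL);
    INSERT INTO txns VALUES ('2015-01-01', 10), ('2015-01-02', 25),
                            ('2015-01-03', 5);
""")

# Teradata form:  SELECT txn_date, CSUM(amount, txn_date) FROM txns;
rows = conn.execute("""
    SELECT txn_date,
           SUM(amount) OVER (ORDER BY txn_date) AS running_total
    FROM txns
""").fetchall()
print(rows)  # [('2015-01-01', 10.0), ('2015-01-02', 35.0), ('2015-01-03', 40.0)]
```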

Environment: Teradata, BTEQ, VBA, SAS, FastLoad, MultiLoad, UNIX, SQL, Scrum, Business Objects, Windows XP.

Confidential

Data Analyst/Business Analyst

Responsibilities:

  • Designed and implemented data models in R around new data sources to analyze customer call tariffs (a Python analogue is sketched after this list).
  • Gathered data requirements around new data sources and new uses for existing data sources
  • Visualized and reported the data findings using D3.js and appropriately provided insights to the IT team
  • Participated on cross-functional project teams to identify critical requirements from a reporting perspective
  • Acquired data from multiple sources and prepared data for further analysis
  • Worked on customer data related issues and worked on project resolution in collaboration with development teams
  • Involved in discussions with business partners to identify questions for data analysis and experiments
  • Met with SMEs to gather information; designed and created process maps, process flows, and swim lanes using Visio.
  • Responsible for managing the process, technology changes and implementing agile methodology
  • Used Excel sheets and flat files to generate Tableau ad-hoc reports (see the report-prep sketch after this list).
  • Generated Tableau dashboards for sales with forecast and reference lines
  • Worked with business managers/leaders; reviewed business and system requirements
  • Acquire data from primary or secondary data sources and maintain databases/data systems
  • Identify, analyze, and interpret trends or patterns in large data sets
  • Utilized Power BI functions and Pivot Tables to further analyze complex data.
  • Understanding the Functional Design Specs and preparing the Technical design
  • Involved in designing and developing High level and Low-level designs
  • Worked extensively on Tableau tools: Repository Manager, Designer, and Server Manager.
  • Involved in Extraction, Transformation and Loading (ETL) Process
  • Created and Monitored Batches and Sessions using Tableau Power Center Server
  • Tuned the mappings to increase its efficiency and performance
  • Used Tableau Workflow Manager to create workflows
  • Workflow Monitor was used to monitor and run workflows
  • Developing Unit Test Plans thoroughly covering all the business scenarios
  • Worked on enhancement activities related to GL accounts
  • Involved in unit testing, systems testing, integrated testing and user acceptance testing
  • Prepared documents related to various HELOC, HELOAN, Reverse Mortgage, SEMAX applications
  • Configured the applications using the Microsoft Office Suite (Word, Excel, Outlook, PowerPoint, Project, Visio) to meet client requirements.
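
The tariff models above were built in R; this is a hedged Python analogue using statsmodels, with invented variables and data, to show the shape of such a regression.

```python
# A minimal OLS sketch of a call-tariff model; all variables are hypothetical.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
call_minutes = rng.uniform(1, 60, size=200)
peak_hour = rng.integers(0, 2, size=200)
# Hypothetical tariff: base fee + per-minute rate + peak surcharge + noise.
charge = 0.5 + 0.12 * call_minutes + 0.8 * peak_hour + rng.normal(0, 0.2, 200)

X = sm.add_constant(np.column_stack([call_minutes, peak_hour]))
model = sm.OLS(charge, X).fit()
print(model.params)  # estimates of base fee, per-minute rate, peak surcharge
```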
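
A minimal sketch of the ad-hoc report preparation described above: combine spreadsheet and flat-file inputs, flag a simple week-over-week trend, and write a CSV for Tableau to pick up as a data source. File and column names are placeholders; the inline DataFrame keeps the sketch runnable.

```python
import pandas as pd

# In practice the inputs came from Excel sheets and flat files, e.g.:
#   sales = pd.concat([pd.read_excel("regional_sales.xlsx"),
#                      pd.read_csv("web_sales.csv")], ignore_index=True)
sales = pd.DataFrame({
    "region":  ["West", "West", "East", "East"],
    "week":    [1, 2, 1, 2],
    "revenue": [100.0, 130.0, 90.0, 81.0],
})

# Flag week-over-week growth as a simple trend indicator per region.
sales = sales.sort_values(["region", "week"])
sales["wow_growth"] = sales.groupby("region")["revenue"].pct_change()

sales.to_csv("tableau_adhoc_extract.csv", index=False)  # ad-hoc Tableau source
print(sales)
```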

Environment: Oracle, XML, SQL, SAS, R, SPSS, SSRS, Metadata Hub, Confidential Data Quality tool.
