Data Analyst Resume
Denver, CO
SUMMARY
- Highly skilled Data Analyst with 8+ years of multi-disciplined data analysis expertise encompassing Life Insurance and Annuity products.
- Experience in developing and deploying projects for Predictive Modeling, Data Validation, Data Acquisition, Data Visualization, and Data Mining with large datasets of structured and unstructured data.
- Expert in Annuities, Life Insurance, Disability, and Supplemental Insurance policies, including administration, sales, customization, claims, pensions, and CMS.
- Expertise in Data Analysis for Data Warehouse development for Online Transaction Processing (OLTP) and Online Analytical Processing (OLAP) systems.
- Passionate about gleaning insightful information from massive data assets and developing a culture of sound, data-driven decision making.
- Solid ability to write and optimize diverse SQL queries; working knowledge of RDBMSs such as MySQL and SQL Server.
- Extensively worked in functional modules of insurance such as Life Insurance, Annuities, and claims management.
- Experience in data masking using the Informatica Data Masking transformation; strong experience using Excel and MS Access to load and analyze data based on business needs.
- Expertise with the ACORD standard Life and Annuity data model, data collection requirements, form models, and archival stipulations.
- Proficient in different types of testing, including Black Box, Smoke, Functional, System Integration, End-to-End, Regression, and User Acceptance Testing (UAT); also involved in Load, Performance, and Stress Testing for Java and Python applications.
- Experience in data integration techniques such as Extraction, Transformation, and Loading (ETL) from disparate source databases such as Oracle, SQL Server, MS Access, flat files, CSV files, and XML files into target warehouses using various transformations in Informatica, DataStage, and ODI.
- Experienced in generating and documenting metadata while designing OLTP system environments.
- Proven experience of building multi-instance re-usable ETL jobs for various projects and actively participated in client meetings to understand requirements and create ETL Technical Design and Mapping documents.
- Proficient in Data Analysis with sound knowledge of extracting data from various database sources such as MySQL, Oracle, SAP BI, and other database systems.
- Created various visualizations in Tableau using bar charts, line charts, pie charts, maps, scatter plots, heat maps, and table reports.
- Experienced in working with Agile (Scrum) and Waterfall software development life cycles.
- Strong knowledge and understanding of Life Insurance products, including Annuities, Whole Life, and Universal Life.
- Good understanding and hands-on experience in setting up and maintaining NoSQL databases such as Cassandra and HBase.
- Extensively worked on statistical analysis tools and adept at writing code in Advanced Excel, R, MATLAB, Python, and SAS.
- Experience with Teradata utilities such as FastExport and MultiLoad for handling various tasks.
- Proficient in the design and development of various dashboards and reports using visualizations such as bar graphs, scatter plots, pie charts, and geographic visualizations, making use of actions and local and global filters according to end-user requirements.
- Expert in DataStage administration activities: creating projects, migrating ETL code, creating environment variables and parameter sets, and assigning users.
- Experienced in data warehousing concepts such as star schemas, snowflake schemas, normalization, de-normalization, and relational and non-relational data structures.
- Excellent understanding in Agile and Scrum development methodologies.
- Experience in data scaling, wrangling, and visualization in R, Python, SAS, and Tableau (a minimal wrangling sketch follows this list).
- Proficient in R data wrangling, manipulation, and functions.
- Delivered presentations on data visualization in R, including scatterplots, histograms, bar and stacked bar charts, boxplots, correlograms, heat maps, and density maps.
- Passionate about learning new technologies; enthusiastic, determined, and persistent in troubleshooting issues, with a positive attitude toward challenging work.
- Highly motivated team player with excellent interpersonal and customer-relations skills; proven communication, organizational, analytical, presentation, and leadership qualities.
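A minimal Python/pandas wrangling sketch in the spirit of the bullets above; the column names and values are hypothetical placeholders, not taken from any actual project.

```python
import pandas as pd

# Hypothetical policy extract; column names and values are illustrative only.
df = pd.DataFrame({
    "policy_number": ["P001", "P002", "P002", "P003", None],
    "product_line":  ["Annuity", "Whole Life", "Whole Life", "Annuity", "Term"],
    "issue_date":    pd.to_datetime(
        ["2015-03-01", "2016-07-15", "2016-07-15", "2016-01-20", "2017-05-05"]),
    "annual_premium": [1200.0, 850.0, 850.0, 2000.0, 300.0],
})

# Basic cleaning: drop exact duplicates and rows missing a policy number.
df = df.drop_duplicates().dropna(subset=["policy_number"])

# Wrangling: aggregate annualized premium by product line and issue year.
summary = (
    df.assign(issue_year=df["issue_date"].dt.year)
      .groupby(["product_line", "issue_year"], as_index=False)["annual_premium"]
      .sum()
)
print(summary)
```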
PROFESSIONAL EXPERIENCE
Data Analyst
Confidential - Denver, CO
Responsibilities:
- Facilitated Joint Requirement Planning (JRP) sessions with SMEs to understand the requirements pertaining to Loan Origination through Loan Processing.
- Very strong knowledge of Commercial Lines Property and Casualty Insurance, including policy and claim processing and reinsurance.
- Analyzed the client's business and system requirements, including those for the Property and Casualty modules.
- Created and maintained SQL Server tables, updating them according to data changes requested by the business.
- Extracted the Business Requirements from the end users keeping in mind their need for the application and prepared Business Requirement Documents (BRD) using Rational RequisitePro.
- Reviewed Design Documents and Requirements Analysis Specifications with the Project Lead, Business Owners, and the Technical Lead.
- Responsible for the development of a Data Warehouse for personal lines life insurance and other annuities.
- Documented all data mapping and transformation processes in the Functional Design documents based on the business requirements.
- Hands-on DataStage administration: creating ETL projects, migrating code, and setting up environment variables.
- Ensured successful delivery of final content by the testing deadline, including test sign-off reports, documentation of outstanding defects and how they would be handled, and hand-off to UAT.
- Created and managed Project Templates, Use Case Project Templates, Requirement Types, and Traceability Relationships in RequisitePro.
- Created automated test scripts representing various transactions, documented the load testing process and methodology, created meaningful reports for analysis, and integrated Performance Testing into the SDLC.
- Created UAT Test Cases to validate end-to-end business flows in Data Driven testing environment.
- Used ER Studio to create logical and physical data models for an enterprise-wide OLAP system.
- Worked on data modeling and produced data mapping and data definition documentation.
- Participated in creating Informatica ETL mappings for new Marketing tables whenever needed.
- Performed data profiling to examine the data available in an existing database.
- Developed and managed project plans and schedules; managed the resolution of project issues and conflicts.
- Performed custom data conversions, developed custom software for risk management application tracking (life insurance, etc.), and performed all data field mappings.
- Wrote Business Cases for the Stakeholders to understand the functioning of the new Underwriting System.
- Involved in creating Data Migration and Cleansing rules for the Integration Architecture (OLTP, ODS, DW).
- Analyzed system performance and initiated process improvement measures for mainframe and Web-based applications.
- Facilitated Apollo formatting and Java scripting for efficient and accurate interfacing with the back-office accounting system.
- Played a key role in mapping of fields in Life Insurance application to New Business Submission Transaction in the ACORD data model.
- Performed data cleansing and created layouts and specifications for end-user reports, delivering an end-to-end Metadata Management solution in the process.
- Created Data Flow Diagrams for the Investment Portfolios database.
- Queried databases using SQL for backend testing (a minimal validation sketch follows this list).
- Conducted Functional Walkthroughs, User Acceptance Testing (UAT), and supervised the development of User Manuals for customers.
- Worked as a user/customer advocate and negotiated with users as well as with developers and management staff to resolve requirement conflicts and bridge the gaps between IT and Business.
- Maintained and managed the various versions of documents generated during the project using Rational ClearCase, and performed defect tracking using Rational ClearQuest.
- Generated test reports using Quality Center.
- Assisted the Scrum Master in preparing and monitoring burn-down charts for the successful completion of tasks within the time-boxes.
- Advised Java programmers of any corrections needed for user and data capture needs; documented all bugs for developers to correct; updated and modified the Access database that supports the Java scripts.
- Handled performance requirements for databases in OLTP and OLAP models.
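A minimal sketch of the kind of backend SQL validation described above, using Python's built-in sqlite3 so the example is self-contained; the actual work ran against SQL Server, and the table and column names here are hypothetical.

```python
import sqlite3

# In-memory database stands in for the real source/target databases;
# table and column names are hypothetical.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE src_policies (policy_number TEXT, premium REAL);
    CREATE TABLE tgt_policies (policy_number TEXT, premium REAL);
    INSERT INTO src_policies VALUES ('P001', 1200.0), ('P002', 850.0);
    INSERT INTO tgt_policies VALUES ('P001', 1200.0), ('P002', 850.0);
""")

# Reconciliation check: row counts must match between source and target.
src_count = conn.execute("SELECT COUNT(*) FROM src_policies").fetchone()[0]
tgt_count = conn.execute("SELECT COUNT(*) FROM tgt_policies").fetchone()[0]
assert src_count == tgt_count, f"Row count mismatch: {src_count} vs {tgt_count}"

# Spot check: rows present in source but missing from target.
missing = conn.execute("""
    SELECT policy_number FROM src_policies
    EXCEPT
    SELECT policy_number FROM tgt_policies
""").fetchall()
print("Missing from target:", missing)
```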
Data Analyst
Confidential - Boston, MA
Responsibilities:
- Led the analysis of end-user requirements and the preparation of data to facilitate analysis (ETL); collected data from the Informatica warehouse, built a pipeline to get the required data, and performed transformations such as aggregating different sources, filtering data, and mapping to the target database.
- Collaborated with data engineers and the operations team to implement the ETL process; wrote and optimized SQL queries to perform data extraction to fit the analytical requirements.
- Designed a web crawler to collect customer-related data and tweets and stored them in a JSON file.
- Developed PySpark modules for machine learning and predictive analytics using data stored in HDFS.
- Created data mapping specifications to create and execute detailed system test plans; the data mapping specifies what data will be extracted from an internal data warehouse, transformed, and sent to an external entity.
- Prepared and reviewed UAT test artifacts (Test Plans, Test Scripts, Traceability Matrices, etc.) with project stakeholders.
- Performed data pre-processing: normalization, feature scaling, removal of duplicate rows and outliers, and table aggregation, storing the results in Pandas DataFrames.
- Analyzed data using SQL, Python, R, Scala, and Apache Spark, and presented analytical reports to technical teams.
- Worked on Life Insurance and Annuity applications wif emphasis on new business and policy administration systems.
- Collaborated with the Scrum Master and Scrum teams in estimation and velocity planning; also participated in Sprint Planning, Daily Stand-Up, Sprint Review, and Retrospective meetings.
- Worked with text feature engineering techniques for textual data, including n-grams, word2vec, and TF-IDF, using the NLTK and Gensim libraries in Python.
- Used text mining and NLP techniques, including LDA for topic modeling, to extract relevant features for analyzing user sentiment toward the different products offered by the organization.
- Built text classification models for sentiment analysis using classical machine learning (Logistic Regression, SVM, KNN, Random Forest, ensemble methods) and deep learning methods (CNN, RNN, LSTM); see the classification sketch after this list.
- Segmented customers based on demographics using K-means clustering (a minimal sketch also follows this list).
- Python, as well as R, were used for programming and continuous improvement of the model.
- Used cross-validation to test the model with different batches of data; tuned the parameters to find the best configuration for the model, which eventually boosted performance.
- Worked with different performance metrics such as F1 score, accuracy, ROC, precision, recall, and log-loss error.
- Conducted User Acceptance Testing (UAT) with users and customers and wrote an issues log based on the outcome of UAT.
- Documented the complete process flow describing program development, logic, testing, implementation, application integration, and coding.
- Responsible for converting the Mutual Life Insurance Fixed and Variable Annuities PC administrative system to the mainframe Vantage system.
- Decided on a state-of-the-art machine learning model, deep neural networks, as it outperformed the ensemble methods.
- Involved in Data modeling (Logical Modeling and Physical Modeling), Data Mapping, Data Conversion, Data Validation, Data Profiling, Data Migration, and Schema Design for Enterprise Data Warehousing.
- Designed visualizations in Tableau that drove performance and provided insights, from prototyping through production deployment, product recommendation, and allocation planning.
- Presented dashboards to higher management for more insights using Tableau and Power BI.
- Extensively worked on data cleaning, ensuring data quality, consistency, and integrity using Pandas and NumPy.
- Used data quality validation techniques to validate the data and identified many anomalies.
- Performed unit and system testing to validate the output of the above data wrangling techniques against the expected results.
- Performed K-means clustering to identify outliers and classify unlabeled data.
- Worked with the sales and marketing teams to collaboratively frame the problem, answer important data questions, prototype and experiment with ML/DL algorithms on the available data, and finally integrate them into a production system for different business needs.
- Worked with Amazon Web Services (EC2/S3) cloud services to do machine learning on big data and integrated with visualization (Tableau) for designing dashboards.
- Designed, built, and deployed a set of Python modeling APIs for customer analytics that integrate multiple machine learning techniques for various user behavior predictions and support marketing segmentation programs.
- Developed a recommendation system, through thorough research, to provide the best set of products to customers, thereby increasing the company's profit.
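A minimal scikit-learn sketch of a TF-IDF text classification pipeline with cross-validated F1, in the spirit of the classification and cross-validation bullets above; the toy data is purely illustrative, and the original work also covered SVM, tree ensembles, and deep models.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline

# Toy labeled texts (1 = positive, 0 = negative); purely illustrative.
texts = [
    "love this annuity product", "great customer service",
    "terrible claims experience", "worst support ever",
    "very happy with my policy", "claims were denied unfairly",
    "smooth enrollment process", "billing errors every month",
]
labels = [1, 1, 0, 0, 1, 0, 1, 0]

# TF-IDF features (unigrams + bigrams) feeding a logistic regression.
model = make_pipeline(
    TfidfVectorizer(ngram_range=(1, 2)),
    LogisticRegression(max_iter=1000),
)

# Cross-validated F1 score, as in the tuning/evaluation bullets above.
scores = cross_val_score(model, texts, labels, cv=4, scoring="f1")
print("F1 per fold:", scores.round(3), "mean:", scores.mean().round(3))
```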
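A minimal K-means segmentation sketch matching the demographic clustering bullet above; the feature set and the choice of k=3 are assumptions for illustration only.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# Hypothetical demographic features: age, annual income (in $1000s).
X = np.array([
    [25, 40], [27, 45], [30, 50],     # younger, lower income
    [45, 90], [48, 95], [50, 100],    # mid-career, higher income
    [65, 60], [68, 55], [70, 58],     # retirees
])

# Scale features first so income does not dominate the distance metric.
X_scaled = StandardScaler().fit_transform(X)

# k=3 is assumed here; in practice k would be chosen via the elbow
# method or silhouette scores.
kmeans = KMeans(n_clusters=3, n_init=10, random_state=42).fit(X_scaled)
print("Segment labels:", kmeans.labels_)
print("Centers (scaled):", kmeans.cluster_centers_.round(2))
```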
Data Analyst
Confidential - Confidential, PA
Responsibilities:
- Worked on version documents with SMEs; served as the primary liaison between business users and the development team.
- Created source-to-target mapping (Data Mapping) documents as per the design needs.
- Extensive experience in all phases of the RUP and SDLC processes.
- Worked in mainframe environment and used SQL to query various reporting databases.
- Deep understanding of Structured System Development Methodologies.
- Evaluated Homeowners, Automobile and Umbrella policies for new and renewal business.
- Assisted Project Manager and User managers to conduct Market Research and Feasibility Studies, and to develop Scope/Vision Documents.
- Responsible for creating the UAT environment; ensured that all test case input sources were available and that test case output results were documented.
- Developed multi-instance reusable ETL jobs that can be used across different school districts using run-time parameters.
- Requirements were gathered from the seven business functions within the Data Management organization.
- Worked extensively with MS Excel and MS Access.
- Used XML for building and parsing the application configuration file (a minimal parsing sketch follows this list).
- Used Castor to create objects from XML documents.
- Created and maintained data model/architecture standards, including master data management (MDM).
- Developed logical and physical data models that capture current-state and future-state data elements and data flows using Erwin.
- Responsible for acting as a liaison with operational and financial management, systems support, and sales to identify business opportunities; developed a file of ACORD Forms used as the standards in all Property and Casualty markets, for both Personal and Commercial Lines of Business.
- Designed DataStage ETL jobs for extracting data from heterogeneous source systems, transforming it, and finally loading it into the Data Marts.
- Conducted Gap Analysis and gathered user requirements through interviews, user meetings, JAD sessions, and requirement elicitation sessions.
- Collected and documented business processes as well as business rules, then translated them into business flow charts.
- Performed data mapping and logical data modeling, created class diagrams and ER diagrams, and used SQL queries to filter data.
- Assisted other project teams with data analysis, data mining, and data profiling as needed.
- Wrote PL/SQL statements and stored procedures in Oracle for extracting as well as writing data.
- Designed and developed Use Cases using UML and Business Process Modeling.
- Developed the QA Test Plan, test conditions, and test cases, and assisted the QA team with System Testing using Mercury TestDirector.
- Developed the User Manual and provided user training.
- Conducted User Acceptance Testing.
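A minimal sketch of parsing an application configuration file, using Python's standard xml.etree for self-containment (the original work used Castor for XML-to-object mapping in Java); the element and attribute names are illustrative assumptions.

```python
import xml.etree.ElementTree as ET

# Hypothetical application configuration; element and attribute names
# are illustrative only.
CONFIG_XML = """
<appConfig>
    <database name="reporting" host="db01.example.com" port="1521"/>
    <feature key="enableAudit" value="true"/>
    <feature key="batchSize" value="500"/>
</appConfig>
"""

root = ET.fromstring(CONFIG_XML)

# Build simple objects (dicts) from the XML, analogous to Castor's
# XML-to-object mapping.
db = root.find("database").attrib
features = {f.get("key"): f.get("value") for f in root.findall("feature")}

print("Database:", db)
print("Features:", features)
```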
Business Data Analyst
Confidential - NYC, NY
Responsibilities:
- Captured business features and delivered the Business Requirements Document.
- Captured functional requirements and delivered the Functional Requirements Specification.
- Captured non-functional requirements and delivered the Non-Functional Requirements Document.
- Expertise in using testing tools such as Mercury Quality Center (QC)/TestDirector and QuickTest Pro (QTP).
- Gathered requirements for and designed data warehouse and data mart entities.
- Designed DataStage ETL jobs for extracting data from heterogeneous source systems, transforming it, and finally loading it into the Data Marts.
- Captured user interface requirements and delivered the User Interface Requirements Document.
- Captured reporting requirements and delivered the Reporting Requirements Document.
- Captured and formulated business rules and delivered the Business Rules Templates.
- Met with client groups to determine project scope, requirements, and plan; utilized the Rational Unified Process (RUP) to configure and develop processes and create BRDs.
- Analyzed business requirements and segregated them into high-level and low-level Use Case, Activity, and Sequence diagrams using Rational Rose according to UML methodology.
- Derived Functional Requirement Specifications based on the User Requirement Specification; understood and articulated business requirements from user interviews and converted them into technical specifications.
- Created interface agreements for new data acquisitions from the Provider database NDB.
- Logged all information on various processes on documents for future reference.
- Monitored project milestones and critical dates to identify potential risks to the project schedule; identified ways to resolve schedule issues; kept management up to date on any changes.
- Assisted in UAT and resolved issues logged by UAT test users.
- Wrote SQL queries to test the transformation rules.
- Validated table layouts, keys, and data in the backend using PuTTY.
- Worked with Agile methodology through all phases of the SDLC (Software Development Life Cycle).
- Performed cross-functional activities as required and created tickets in HP ALM.
- Worked on Report generation for Managed Care and representation using Power BI.
- Worked with Data Modelers/Data Architects to integrate source tables into the target layer.
- Extensively used tools in the Hadoop ecosystem, including Pig, Hive, HDFS, MapReduce, and Sqoop (a minimal Hive validation sketch follows this list).
- Defined data format for the attributes to map the data from source to target.
- Involved in development tasks including control table entries and other ETL tasks.
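A minimal PySpark sketch of validating a source-to-target transformation rule against Hive tables, combining the SQL-testing and Hadoop bullets above; it assumes a cluster with Hive configured, and the database, table, and column names are hypothetical.

```python
from pyspark.sql import SparkSession

# Hive support must be available in the cluster; table names are hypothetical.
spark = (
    SparkSession.builder
    .appName("transformation-rule-check")
    .enableHiveSupport()
    .getOrCreate()
)

# Transformation rule under test: target.member_name should equal the
# trimmed, upper-cased source.raw_name.
mismatches = spark.sql("""
    SELECT s.member_id, s.raw_name, t.member_name
    FROM source_db.members s
    JOIN target_db.members t ON s.member_id = t.member_id
    WHERE t.member_name <> UPPER(TRIM(s.raw_name))
""")

print("Rule violations:", mismatches.count())
mismatches.show(10, truncate=False)
```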