Sr. Data Analyst Resume

Minneapolis, MN

SUMMARY

  • Over 8 years of IT experience as a Data Analyst with a solid understanding of data warehouse/data mart development, SQL, and analysis of Online Transaction Processing (OLTP), data warehouse (OLAP), and Business Intelligence (BI) applications in the insurance industry.
  • Experienced in Azure SQL, Azure Data Lake, Azure Data Lake Analytics, and Azure Data Factory.
  • Experience in developing and maintaining logical and physical Data models for Data Warehouse applications using tools like Erwin and ER/Studio.
  • Extensive experience in RDBMS implementation and development using SQL, PL/SQL stored procedures and query optimization.
  • Experienced in Software Development Life Cycle methodologies (Waterfall, Iterative, and Agile Scrum).
  • Extensive experience in Text Analytics, developing different Statistical Machine Learning, Data Mining solutions to various business problems and generating Data Visualizations using R and Python.
  • Experienced in working with Equities, Equity Derivatives, ETFs, Options, Bonds, MBS, Repos, Reverse Repos, Liquidity Risk, Commodities, Basel, Regulatory Reporting, Municipal Bonds, FIX, Order Routing, Data Feeds, Trade Flow, Trade Lifecycle, Trading, and FINRA.
  • Experience with trading systems and the trade life cycle: order initiation, pre-trade compliance checking, trade execution, clearing, settlement, settlement position reporting, position risk reporting, custody, and client reporting.
  • Experienced with ACORD standards: implementing, mapping, and documenting various projects across diverse property and casualty insurance lines of business.
  • Experience in designing and implementing data structures and commonly used business intelligence tools for data analysis.
  • Experience in Dimensional Data Modeling using Star and Snowflake Schemas. Ensured appropriate processes and controls across major process areas, including but not limited to data quality, ETL, and metadata management.
  • Experience in working with business intelligence and data warehouse software, including SSAS/SSRS/SSIS, Business Objects, Amazon Redshift, Azure Data Warehouse and Teradata.
  • Vast experience working in data management, including data analysis, gap analysis, data mapping, data validation, data profiling, data cleansing, and data scrubbing, and reference data reports utilizing Tableau visualizations such as Dual Axis charts.
  • Expertise in OLTP/OLAP System Study, developing Database Schemas like Star Schema and Snowflake Schema used in relational, dimensional and multidimensional modeling.
  • Extensively worked on Data Governance, i.e., metadata management, master data management, data quality, and data security.
  • Worked with Amazon Web Services (AWS) for a multitude of applications, focusing on high availability, fault tolerance, and auto-scaling.
  • Experience in Business and Data Analysis, Data Profiling, Data Migration, Data Conversion, Data Quality, Data Integration and Metadata Management Services and Configuration Management.
  • Experienced with Data Quality Management, Metadata Management, Master Data Management, Process Modeling, Data Dictionary, Data Stewardship, Data Profiling, Data Quality, and Data Model Standards.
  • Experience in analysis, design, development, implementation, and troubleshooting of Data Mart/Data Warehouse applications using ETL and BI tools such as SSIS, OLTP, OLAP, Business Objects, Tableau, Informatica PowerCenter, and Power BI.
  • Experience in Property and Casualty insurance, including Auto, Home, and Marine, as well as Life insurance/Annuities.
  • Worked as a Data Analyst using the iterative Software Development Lifecycle (SDLC) principles of Waterfall and Agile (Scrum framework).
  • Strong knowledge of BPM and BPMN, with dashboarding and data visualization skills using Excel, Tableau, and similar tools.
  • Worked with Data Analysts on SQL data tables, statements, and reports for validation of household participants' mutual fund and annuity investments.
  • Experience in extracting, transforming and loading (ETL) data from spreadsheets, database tables and other sources using Microsoft SSIS and Informatica.
  • Experience in designing Star schema and Snowflake schema for Data Warehouse and ODS architecture.
  • Deep technical background and solid working knowledge of SQL Server Data warehouse systems.
  • In-depth knowledge of Rational Unified Process (RUP); risk engineering; data modeling and mapping; and design using UML, Rational Rose, Visio and other related tools.
  • Experience in Property, Casualty, Annuities, Disability, and Supplemental Insurance policies, including administration, sales, customization, claims, pensions, and CMS.
  • Created SSIS packages to load data into the Data Warehouse using various SSIS tasks such as Execute SQL Task, Bulk Insert Task, Data Flow Task, File System Task, Send Mail Task, ActiveX Script Task, XML Task, and various transformations.
  • Demonstrated ability to work actively in different phases of SDLC, Agile/Scrum in teams, fostered cooperation and collaboration among individuals in the work unit and helped team resolve conflicts constructively.
  • Experience working in RDBMS & Data Warehouse projects.
  • Experience in SQL for database management, data migration, and database validation (SQL Server, Postgres, Teradata).
  • Expert at Risk Management through risk identification, impact assessment, prioritization analysis, risk tracking, and mitigation, with monitoring throughout.
  • Comprehensive understanding and experience of SDLC and project life cycle methodologies such as Waterfall, Agile, Rapid Application Development (RAD), and Rational Unified Process (RUP).
  • Strong experience in testing using HP Quality Center.
  • Experience in automation tools such as Rational RequisitePro and QuickTest Pro.
  • Expertise in Database Administration of Test Environment Databases, Data Pools and identifying performance issues due to inefficient Database Administration.
  • Excellent writing skills in generating Project Charters, Project Plans, Business Requirements Documents, Use Case Specifications, Test Plans, Credit Risk Management Plans, and Liquidity Risk Assessments, and in publishing regular reports with status and metrics.
  • Expertise in database querying and writing T-SQL queries; a brief validation sketch follows this list.
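
A minimal sketch of the kind of T-SQL validation query referenced above, run from Python via pyodbc. The connection string and table names (staging.policy, dw.dim_policy) are hypothetical placeholders, not details from any actual engagement.

    # Compare row counts between a staging table and its warehouse target.
    # Connection string and table names are illustrative placeholders.
    import pyodbc

    CONN_STR = (
        "DRIVER={ODBC Driver 17 for SQL Server};"
        "SERVER=dw-server;DATABASE=warehouse;Trusted_Connection=yes;"
    )

    VALIDATION_SQL = """
        SELECT
            (SELECT COUNT(*) FROM staging.policy) AS source_rows,
            (SELECT COUNT(*) FROM dw.dim_policy)  AS target_rows;
    """

    with pyodbc.connect(CONN_STR) as conn:
        source_rows, target_rows = conn.cursor().execute(VALIDATION_SQL).fetchone()
        status = "PASS" if source_rows == target_rows else "FAIL"
        print(f"{status}: source={source_rows}, target={target_rows}")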

TECHNICAL SKILLS

Tools: JIRA, Team Foundation Server, IBM Cognos, Tableau, Informatica, SSIS (SQL Server Integration Services), SSAS (SQL Server Analysis Services), SSRS (SQL Server Reporting Services), MS Excel (Pivot Tables, VLOOKUPs), MS Word, MS Visio, SharePoint, MS Project, MS PowerPoint

Business Skills: Change Management, Business Process Re-engineering, Use Case Modeling, JAD Sessions, Requirements Workshops, Gap Analysis, SWOT Analysis, Impact Analysis, Cost Benefit Analysis, NPV and ROI Analysis, Business Process Analysis, Capital Budgeting, Project Planning, Scheduling and Budgeting

Databases: MySQL, Oracle, Vertica

PROFESSIONAL EXPERIENCE

Confidential, Minneapolis, MN

Sr. Data Analyst

Environment: MS Office Tools, Windows XP, MS Project, SQL, Agile, Charles River IMS, SWIFT, QuickTest Pro, Requisite Pro, MS Visio, ClearCase, Visual Studio, MS SharePoint

Responsibilities:

  • Used WebInspect to perform automated scans of online applications in production, followed by report presentation.
  • Reviewed the data model and reporting requirements for Cognos Reports with the Data warehouse/ETL and Reporting team.
  • Analyzed existing trade lifecycle from client initiation to post-trade compliance including coordination with broker-dealers.
  • Implemented the project using the Agile methodology based on iterative and incremental development, where requirements and solutions evolve through collaboration with cross-functional teams, producing artifacts across Software Development Life Cycle (SDLC) phases.
  • Involved in Data Validations, Data Interfaces, Data Cleansing, Data Mappings, and Data Conversions of both SAP Master Data and SAP Transactional Data.
  • Designed and developed the logical and physical data models to support the Data Marts and the Data Warehouse.
  • Created Use Cases, Activity Reports, and Logical Components to extract Business Process Flows and workflows involved in the project using Rational Rose, UML, and flowchart tools such as MS Visio.
  • Developed Data Migration, Data Profiling, and Cleansing rules considering data linkage and data transformation rules.
  • Worked on Data Mining and data validation to ensure the accuracy of the data between the warehouse and source systems.
  • Created Data Quality scripts using SQL and Hive to validate successful data loads and the quality of the data. Created various types of data visualizations using Python and Tableau.
  • Developed a file of ACORD Forms used as the standards in all Property and Casualty markets, for both Personal and Commercial Lines of Business.
  • Designed the software systems to enter trades performed on the DB DSS systems into the CRTS for pre-trade compliance and execution.
  • Created Python programs to automate the production of Excel sheets by reading data from Redshift databases; a sketch follows this list.
  • Developed Data Mapping and Data Transformation documents between the legacy database, the production DWH, and front-end applications.
  • Involved in modeling (Star Schema methodologies) in building and designing the logical data model into Dimensional Models.
  • Utilized Charles River (CRD) affirmation/confirmation for both fixed income and equity clearing and settlement.
  • Wrote and executed positive and negative test cases to ensure data originating from the Oracle data warehouse remains accurate through to the SQL databases in the applications.
  • Customized and developed new metrics and calculated metrics for GL, BS, P&L, Budget, and Asset Management on the Business Model & Mapping Layer and Presentation Layer of the Oracle Analytics repository per client requirements, and deployed them on the BI server.
  • Documented the source to target mappings for both data integration as well as web services
  • Worked with the MDM team on various business operations across the organization.
  • Used advanced Excel features such as charts, formulas, pivot charts, dashboards, and pivot tables for data analysis and visualization, helping end users better understand the information being communicated.
  • Elicited requirements to understand business rules, business processes, and business flows, and followed the business's SDLC standards in project meetings.
  • Responsible for executing Volume Testing, Compliance Testing, Data Quality Testing, Unit Testing, Regression Testing, Acceptance Testing, and Stress Testing.
  • Extensively involved in writing ETL code using the Informatica tool to meet requirements for extraction, cleansing, transformation, and loading of data from source to target data structures.
  • Developed mapping spreadsheets for the ETL team with source-to-target data mappings including physical naming standards, data types, volumetrics, domain definitions, and corporate metadata definitions.
  • Designed and developed SQL database scripts and the process to manage database DDL and DML changes.
  • Used DOORS to link requirements to test cases, business rules, and design items to maintain traceability.
  • Wrote SQL queries to define, identify, and validate the code written for data movement into the database tables.
  • Provided metrics, business analysis, and recommendations to executive leadership to support decisions affecting sales compensation.
  • Involved in the annuity performance data process.
  • Responsible for data mapping of feeds from several reference data systems, including Bloomberg, to several in-house applications.
  • Conducted testing to validate content, integrity, format, and data model changes during data migration from source to target systems.
  • Involved in data migration from staging to integration.
  • Used MS Visio to document current workflows, manual processes, and end-to-end processing of system interactions.
  • Assisted and executed associated project plans and test scripts for User Acceptance Testing (UAT) and subsequent user training.
  • Documented and delivered Functional Specification Document to the project team.
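
A minimal sketch of the Redshift-to-Excel automation mentioned above: it pulls an aggregated result set with psycopg2 and writes one worksheet per desk with pandas. The cluster host, credentials, query, and file name are hypothetical placeholders, not details from the actual engagement.

    # Pull an aggregated result set from Redshift and write it to Excel.
    # Host, credentials, query, and output file are illustrative placeholders.
    import os

    import pandas as pd
    import psycopg2

    conn = psycopg2.connect(
        host="example-cluster.redshift.amazonaws.com",
        port=5439,
        dbname="analytics",
        user="report_user",
        password=os.environ["REDSHIFT_PASSWORD"],
    )
    try:
        df = pd.read_sql(
            "SELECT trade_date, desk, SUM(notional) AS notional "
            "FROM trades GROUP BY trade_date, desk",
            conn,
        )
    finally:
        conn.close()

    # One worksheet per desk keeps the workbook easy to filter for end users.
    with pd.ExcelWriter("daily_trades.xlsx") as writer:
        for desk, group in df.groupby("desk"):
            group.to_excel(writer, sheet_name=str(desk)[:31], index=False)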

Confidential, Cary, NC

Data Analyst

Environment: MS Visio, MS Project, MS Office Suite (Word, PowerPoint, Access, Excel), MS SharePoint, JIRA, Quality Center, MS SQL Server, Agile, Charles River IMS, Oracle Server, NICE Actimize, Tableau, Spotfire

Responsibilities:

  • Participated in data modeling, data analysis and requirement documentation.
  • Created debit invoices and payable invoices for the accounting team to balance amount differences.
  • Worked with project management, business teams, and departments to assess and refine requirements to design/develop BI solutions using MS Azure.
  • Responsible for facilitating enterprise definition meetings with Life and Annuity SMEs and project Data Analysts.
  • Using MS Visio, analyzed business requirements and processes through Use Case, Class, Sequence, and Activity diagrams, and adapted UML standards to define modularized Data Process Models.
  • Developed User Stories throughout the Agile lifecycle, as well as UML diagrams such as Use Case, Activity, ER, and chart diagrams with MS Visio.
  • Performed client data gap analysis and documented requirements to fulfill VM requirements from a pre-trade and post-trade compliance perspective
  • Organized and maintained all company incentive compensation plans, ensuring they were documented, current and had the proper approvals.
  • Worked on modeling and designing the database for data warehouse workloads and created the objects in Redshift.
  • Wrote Python scripts to parse XML documents and load the data into the database; a sketch follows this list.
  • Performed Data Profiling and extensive data validation.
  • Created technical impact documents and data mapping documents, and defined ACORD mappings for various interfaces based on knowledge of business needs, working closely with senior department managers and key department members with limited supervision.
  • Key liaison for the business in identifying the enhancement requests and writing specs for Charles River/Eagle and other interface projects including configuration changes, software patches, and upgrades.
  • Worked on project life cycle and SDLC methodologies including RUP, RAD, Waterfall and Agile.
  • Applied data cleansing/data scrubbing techniques to ensure consistency amongst data sets. Independently performed complex troubleshooting, root-cause analysis, and solution development.
  • Worked with Data Warehouse developers to evaluate impact on current implementation, redesign of all ETL logic.
  • Created new user guides in Confluence for corporate actions, model analysis, model validation, pre-trade compliance and Trade Blotter
  • Worked with the Data Conversion team (BODS), the Data Quality team (cleansing using Information Steward), and Data Governance.
  • Involved in Data Modeling using Star/Snowflake schema design in Erwin.
  • Created source-to-target data mapping documents of input/output attributes with the proper transformations to ease development of the SSIS bridge.
  • Proficient in writing Test Plans, Test Scripts, Test Scenarios, Test Cases and experience in executing Integration, System, Regression and User Acceptance testing.
  • Created complex reports/dashboards using advanced Excel and Tableau and performed data analysis on large datasets to solve business issues.
  • Implemented metadata standards, data governance and data stewardship, master data management (MDM), ETL, ODS, data warehouse, data marts, reporting, dashboards, analytics, segmentation, and predictive modeling.
  • Created BPM documents for risk and compliance processes using Visio.
  • Reverse engineered the reports and identified the Data Elements (in the source systems), Dimensions, Facts and Measures required for new enhancements of reports.
  • Conducted design discussions and meetings to arrive at the appropriate Data Mart at the lowest level of grain for each of the dimensions involved.
  • Designed a Star schema for the detailed data marts and plan data marts involving conformed dimensions.
  • Created and maintained the Data Model repository as per company standards.
  • Conducted design reviews with the business analysts and content developers to create a proof of concept for the reports.
  • Created data model, solution design, and technical design documents.
  • Performed ETL processes; scrubbed data; and performed data mapping using MS Excel and MS Access to ensure data quality and workflow processes were in compliance with the new government mandated policies and standards
  • Performed extensive data analysis using SQL for requirements and troubleshooting purposes.
  • Developed reports, charts, tables, graphs, and intermediate statistical analyses using tools such as SAS, SQL, Tableau, and MS Excel.
  • Facilitated data-based decision-making by reporting trend and compensation analyses based on growth initiatives.
  • Used the Curam framework to drive requirements, process flow documents, and subsequent design estimation. Used Query Analyzer and the Execution Plan to optimize SQL queries. Wrote SQL and PL/SQL statements: stored procedures, functions, triggers, and packages.
  • Worked with the Data Warehouse team in the development and execution of data conversion, data cleansing, and standardization strategies and plans as several small tables were combined into one single data repository system, MDM (Master Data Management).
  • Analyzed the data flow of complex business systems and data migration requirements; helped technical staff test and improve the performance of the data migration engine.
  • Followed the SCRUM Project Management Agile Methodology for the entire SDLC.
  • Developed business requirements, GAP analysis, use case diagrams, and data flow diagrams.
  • Wrote SQL queries to understand data relations and variances from other source systems.
  • Developed various reports for user verification, such as cross-tab reports and sub-reports, and various charts and graphs such as bar charts, line graphs, and pie charts using Crystal Reports, and exported reports to formats including PDF, HTML, Excel, Word, and RTF.
  • Created product documentation including online help, printed user manual, and training materials.
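
A minimal sketch of the XML-parsing-and-load scripts described above: it reads hypothetical policy records with xml.etree.ElementTree and loads them into a table. The XML layout and schema are invented for illustration, and sqlite3 stands in for the actual relational database.

    # Parse XML policy records and load them into a staging table.
    # The XML layout and table schema are invented for illustration;
    # sqlite3 stands in for the actual target database.
    import sqlite3
    import xml.etree.ElementTree as ET

    conn = sqlite3.connect("staging.db")
    conn.execute(
        """CREATE TABLE IF NOT EXISTS policy (
               policy_id TEXT PRIMARY KEY,
               holder    TEXT,
               premium   REAL)"""
    )

    root = ET.parse("policies.xml").getroot()
    rows = [
        (
            p.findtext("PolicyId"),
            p.findtext("Holder"),
            float(p.findtext("Premium", "0")),
        )
        for p in root.iter("Policy")
    ]
    conn.executemany("INSERT OR REPLACE INTO policy VALUES (?, ?, ?)", rows)
    conn.commit()
    conn.close()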

Confidential, Charlotte, NC

Data Analyst

Environment: HP Quality Center, Agile, Charles River IMS, ETL, SDLC, HTML, Oracle Fusion Middleware, MS Visio, Mercator, SharePoint, Alteryx, Instream, BizTalk, MS Office, Teradata, (Defect Management Tool), SQL, VB Script

Responsibilities:

  • Performed extensive backend testing using SQL queries to retrieve data from the database and verify the data displayed in the UI.
  • Conducted User Acceptance Testing, gathered and documented User Manuals and Business Rules.
  • Primary responsibilities included developing a new Treasury Fixed Income BI database and Advanced Excel financial models to identify the potential risk of exposures of the fixed income positions being analyzed.
  • Proficient in using multiple systems/databases for payroll and policy information research for the Confidential team.
  • Coordinated requirements elicitation to extract business needs by liaising with key stakeholders.
  • Analyzed user requirements, procedures, and problems to automate business processes or improve existing systems.
  • Created, documented and maintained logical and physical database models in compliance with enterprise standards and maintained corporate metadata definitions for enterprise data stores within a metadata repository.
  • Assisted in writing test case scenarios and performed Claims Unit testing, Regression testing, Integration testing and Compliance testing.
  • Assisted in the data mapping to Charles River IMS, to create alerts, warnings and pre/post trade compliance rules from the enhanced system.
  • Performed financial data analysis, data mapping, data conversion, data auditing and aggregation of funds involved in Alternative Asset Investment
  • Analyzed batch processes and set up CRD/CRIMS (Charles River Trading System) import/export configuration for DoFeeds and DoExport scripts, and other EOD processes such as cancel day orders, calcBond, and compliance.
  • Developed business requirements document for implementing the pre-trade and post-trade compliance rules on Longview Compliance platform.
  • Analyzed the data required to support all the pre-trade and post-trade compliance rules and created detail functional and technical specifications for the technology team to build various data interfaces from the legacy applications to the new platform.
  • Created pre-trade compliance rules for account and security level restrictions based on the client guidelines.
  • Used MS Word and Visio to document the data flow of the AS-IS process and the TO-BE process.
  • Analyzed source databases and created data mapping documents for the data migration process.
  • Participated in project management, design, requirements definition, specification, budgeting, control, and restructuring of organizations
  • Designed and developed the data load process using XML style sheets and ACORD.
  • Responsible for data cleaning, feature scaling, and feature engineering using NumPy and Pandas in Python; a sketch follows this list.
  • Conducted Exploratory Data Analysis using Python Matplotlib and Seaborn to identify underlying patterns and correlation between features.
  • Provide reporting, analysis and payout processing for data and apprentice teams within Managed Business Objectives Compensation
  • Executed Data quality checks in the form of SQL queries to ensure data integrity and enforce business rules.
  • Wrote artifacts that explained how to preserve existing garnishment volume reporting and improve report presentation in new system.
  • Executed all aspects related to employee health insurance, workers' compensation, life insurance, long/short term disability, and employee discount/store benefits, including company events coordination.
  • Involved in all phases of software development life cycle (SDLC) in RUP framework.
  • Identified, defined, documented, and communicated the data migration requirements.
  • Prepared, reviewed, and updated business requirement documents and user acceptance test plans.
  • Created burn down charts, sprint & release backlogs for effective project management.
  • Developed and managed deposits-related functional requirements.
  • Developed and elicited requirements for departmental and some corporate projects as requested by internal/external customers; prepared business and functional requirements.
  • Documented all data mapping and transformation processes in the Functional Design documents based on the business requirements.
  • Extensively involved with data cleansing and formatting of the data to correct mismatches in the staging area.
  • Performed Data Profiling and extensive data validation to ensure report matches with existing mainframe Files.
  • Extensively involved in Data Extraction, Transformation and Loading (the ETL process) from XML to the staging area and from staging to the ODS using Informatica PowerCenter.
  • Worked on a daily basis with lead Data Warehouse developers to evaluate the impact on the current implementation and the redesign of all ETL logic. Worked extensively with the ERwin Model Mart for version control.
  • Subject matter expertise (SME) assistance with life insurance and annuity products
  • Wrote SQL scripts to retrieve data from tables and views using the LIKE operator and other wildcards.
  • Designed Test Strategy and Test Plan to ensure the features to be tested are documented.
  • Worked closely with the Project Manager and coordinated the testing tasks.
  • Performed Incremental load with several Dataflow tasks and Control Flow Tasks using SSIS.
  • Adept at coordinating with stakeholders, Subject Matter Experts (SMEs) and end users to understand, analyze, communicate and validate requirements through User interviews, Joint Application Design (JAD), Joint Application Review (JAR) sessions.
  • Identified source systems and designated data point files for annuities modeling and experience study efforts.
  • Mapped actuarial concepts (annuity valuation, reserving, reinsurance) to data model structures: identified existing, or created new, business terms, their descriptions, and sourcing.
  • Analyzed annuities' accumulation and payout lifecycles: cash flows, terminations, selections, and utilizations.
  • Used VizQL tools such as Tableau to identify data trends and anomalies by creating visualization reports.
  • Developed Data Quality Management exception rules using SQL for data cleansing and various loan codes, and kept pace with architectural changes.
  • Created ETL packages utilizing SSIS to extract data from heterogeneous sources such as Excel, flat files, and network drives to the staging area.
  • Used UML methods to develop use case diagrams, swim lane charts, process workflow diagrams, data flow diagrams, and Business Process Models (BPM) to represent the system workflow and capture bottlenecks in the application.
  • Performed AML report analysis to ensure the application development process is efficient for projects that include complex reporting, variance reporting, and dashboard reporting.
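
A minimal sketch of the NumPy/Pandas cleaning, scaling, and feature engineering described above; the input file and column names (balance, income, debt, state) are hypothetical placeholders.

    # Clean a raw extract, then scale and engineer features with NumPy/Pandas.
    # Input file and column names are illustrative placeholders.
    import numpy as np
    import pandas as pd

    df = pd.read_csv("loans_raw.csv")

    # Cleaning: drop exact duplicates, fill missing balances, normalize text.
    df = df.drop_duplicates()
    df["balance"] = df["balance"].fillna(0.0)
    df["state"] = df["state"].str.strip().str.upper()

    # Feature scaling: min-max scale balance into [0, 1].
    bal = df["balance"]
    df["balance_scaled"] = (bal - bal.min()) / (bal.max() - bal.min())

    # Feature engineering: log-transform skewed income; derive a ratio.
    df["log_income"] = np.log1p(df["income"])
    df["debt_to_income"] = df["debt"] / df["income"].replace(0, np.nan)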

Confidential, Brentwood, TN

Data Analyst

Environment: RUP, Crystal Reports, Rational Requisite Pro, Charles River Investment Management System (CRIMS) 9.x, MS Visio, SQL, Agile, SharePoint 2010, HP ALM, UML, Microsoft Office, Windows XP/2007.

Responsibilities:

  • Provided implementation assessment for Rational Requisite Pro, Rational Rose, UML and RUP.
  • Identified and involved all key stakeholders, contributors, business, operations, and technical resources that must participate in a project, and ensured that contributors were motivated to complete assigned tasks within the parameters of the project plan.
  • Used SQL Queries in Oracle to pull out data from the databases for the data validation and routine report generation.
  • Performed data analysis and created dashboards and worksheets using Tableau.
  • Subject Matter Expert on various databases used for the purchasing of indirect materials, and a data migration analyst.
  • Involved in the Business Blueprint of Data Management for MDM, Data Conversions, Interfaces, Data Quality and Data Archiving.
  • Extensively involved in Data Analysis, Data Cleansing, requirements gathering, Data Mapping, functional and technical design docs, and process flow diagrams.
  • Design and develop the data load process using Informatica Power Center and XML Style sheets and ACORD XML.
  • Responsible for defining the naming standards for the data warehouse.
  • Generated ad-hoc SQL queries using joins, database connections, and transformation rules to fetch data from legacy DB2 and SQL Server database systems.
  • Exhaustively collected business and technical metadata and maintained naming standards
  • Worked closely with the ETL SSIS Developers to explain the complex Data Transformation using Logic
  • Created ETL jobs and custom transfer components to move data from Oracle source systems to SQL Server using SSIS.
  • Created swim lane process flow diagrams for Bloomberg AIM Order management system (Fixed Income) and Charles River Trading (Equity).
  • Involved in extensive DATA validation using SQL queries and back-end testing.
  • Wrote complex SQL and PL/SQL test scripts for backend testing of the data warehouse application and to check data flow in different environments.
  • Led full SDLC, including scoping, planning & costing, business modeling, requirements creation, design, development, and implementation for a global STP trade order management system (OMS). Effort included three tiered architecture, pre-trade compliance, centralized product sourcing, and flow from OMS to accounting system trade capture.
  • Involved in the OLAP model based on dimensions and facts for efficient data loads, using multi-dimensional models such as Star and Snowflake schemas across levels of reports, with integration through SWIFT messaging with counterparties through the complete settlement process.
  • Developed the long-term data warehouse roadmap and architectures, and designed and built the data warehouse framework per the roadmap.
  • Involved in loading data from source tables to Operational Data Store (ODS) tables using transformation and cleansing logic.
  • Created the conceptual model for the data warehouse, with emphasis on insurance (life and health), mutual funds, and annuities, using the Erwin data modeling tool.
  • Cleansed, extracted, and analyzed business data on a daily basis and prepared ad-hoc analytical reports using Excel and T-SQL.
  • Used SQL, Python, Excel, and other tools to create reports and develop practical solutions to operational challenges; a sketch follows this list.
  • Data modeling using ER diagrams and process modeling using data flow diagrams in the design process.
  • Created Use Cases using UML and managed the entire functional requirements life cycle using Rational Unified Process, Requisite Pro.
  • Created ETL Informatica execution scripts for automating jobs.
  • Created and Designed logical database schema layout structure to prepare for ETL Informatica processes, data mining, extraction, Analysis and Reporting System.
  • Tested pre- and post-trade compliance rules across all asset classes and metrics such as duration, delta exposure, and benchmark during the migration from Charles River IMS to BlackRock Aladdin.
  • Created recurring scheduling instances in Business Objects for the daily scheduling of the underlying WebI reports for the dashboard.
  • Gathered and reviewed business requirements and analyzed data sources from Excel/SQL for design.
  • Created Process Work flows, Functional Specifications, and responsible for preparing Software Requirement Specifications (SRS), Functional Specification Document (FSD) and final Design Document as per SEI CMM standards.
  • Provided information security, compliance, risk advisory, and risk management services as primary responsibilities.
  • Involved in mentoring specific projects in the application of the new SDLC based on the Agile Unified Process, especially from the project management, requirements, and architecture perspectives.
  • Created and managed project templates, Use Case project templates, requirement types, and traceability relationships in Requisite Pro.
  • Involved in collaborating with ETL/Informatica teams to source data, perform data analysis to identify gaps
  • Developed ETL routines using SSIS packages, to plan an effective package development process, and design the control flow within the packages.
  • Provided inputs in the strategic development of detailed project plans, work assignments, target dates etc.
  • Developed strategies with the Quality Assurance group to implement Test Cases in Mercury TestDirector for stress testing and User Acceptance Testing (UAT).
  • Utilized Power BI (Power View) to create various analytical dashboards that depict critical KPIs such as legal case matters, billing hours, and case proceedings, along with slicers and dicers.
  • Developed solid background and knowledge of Fixed and Variable Life and Annuity products, personal financial planning strategies for high net worth individuals, commissions and illustration systems.
  • Constructed effective workflows by producing use case diagrams in UML to model system functions and define complex business processes.
  • Developed best practices, processes, and standards for effectively carrying out data migration.
  • Collected data and analyzed the metrics when updating status with the team.
  • Created data flow diagrams and data mapping documents, from source to stage and stage to target, indicating the source tables, columns, data types, transformations required, and business rules to be applied.
  • Extracted large volumes of data feed from different data sources, performed transformations and loaded the data into various Targets.
  • Implemented numerous automated data collection systems utilizing MS Access and SharePoint, for reporting and analytics.
  • Worked extensively with Microsoft Excel (macros, VLOOKUPs, and pivot tables); proficient in Microsoft Word, PowerPoint, and Access.
  • Performed extraction, transformation and loading of data from RDBMS tables, Flat Files, SQL into Teradata in accordance with requirements and specifications.
  • Created product documentation, including online help, printed user manual, and training materials.
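
A minimal sketch of the kind of ad-hoc report combining T-SQL and Excel referenced above: aggregate in SQL, pivot in pandas, and export to a workbook. The DSN, table, and column names are hypothetical placeholders.

    # Ad-hoc analytical report: aggregate in T-SQL, lay out in Excel.
    # DSN, table, and column names are illustrative placeholders.
    import pandas as pd
    import pyodbc

    SQL = """
        SELECT product_line,
               DATEPART(quarter, trade_date) AS qtr,
               SUM(market_value)             AS mv
        FROM   positions
        GROUP  BY product_line, DATEPART(quarter, trade_date);
    """

    conn = pyodbc.connect("DSN=warehouse;Trusted_Connection=yes;")
    try:
        report = pd.read_sql(SQL, conn).pivot(
            index="product_line", columns="qtr", values="mv"
        )
    finally:
        conn.close()

    report.to_excel("market_value_by_quarter.xlsx")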

Confidential, Bloomington, IL

Data Analyst

Environment: Scrum (Agile), MS Visio, .NET, MySQL, MS Word, MS PowerPoint, SharePoint, Excel, JAD, HP ALM, Web Services, HTML, XML, Informatica, JIRA, Rational ClearQuest

Responsibilities:

  • Created and managed administration processes including workflows, validation rules, approval process, assignment rule, and list generation.
  • Created internal and vendor system P&C, Life, Annuity, Group product process maps for Policy Administration & Billing systems
  • Facilitated vendor (STG Mastek) gap analysis sessions for P&C, Life, Annuity, Group Direct and List Billing system with quality of service (QoS) parameters and web content management.
  • Defined and documented the vision and scope of the project. Conducted one-on-one interviews with the high-level management team and participated in JAD sessions with the SMEs.
  • Involved with data profiling for multiple sources and answered complex business questions by providing data to business users.
  • Worked with data investigation, discovery, and mapping tools to scan every single data record from many sources.
  • Designed class and activity diagrams using PowerDesigner and UML tools such as Visio.
  • Involved in designing and developing Data Models and Data Marts that support the Business Intelligence Data Warehouse.
  • Utilized the corporation-developed Agile SDLC methodology and used JIRA and Microsoft Office software to perform required job functions.
  • Used Snowflake functions to parse semi-structured data entirely with SQL statements.
  • Involved in requirements gathering and in the database design and implementation of a star-schema, snowflake-schema/dimensional data warehouse using ER/Studio.
  • Performed data management projects and fulfilled ad-hoc requests according to user specifications by utilizing data management software programs and tools such as Perl, Toad, MS Access, Excel, and SQL.
  • Wrote SQL scripts to test the mappings (a sketch follows this list) and developed a Traceability Matrix of Business Requirements mapped to Test Scripts to ensure any change control in requirements leads to test case updates.
  • Involved in extensive data validation by writing several complex SQL queries; involved in back-end testing and worked with data quality issues.
  • Designed Informatica ETL mapping documents and created the ETL staging area framework, data mapping documents, data flow diagrams, ETL test scripts, etc.
  • Identified data conversion and translation strategies, i.e., review, interpretation, and translation of requirements and use cases; defined, maintained, and enforced the use of standardized enterprise data element names, abbreviations, definitions, characteristics, and domains.
  • Involved in various projects related to Data Modeling, System/Data Analysis, Design, and Development for both OLTP and data warehousing environments.
  • Documented the Requirement Traceability Matrix for tracing the Test Cases and requirements related to them.
  • Conducted user interviews and documented business and functional requirements.
  • Participated in project management, design, requirements definition, specification, budgeting, control, and restructuring of organizations
  • Involved in data mapping specifications to create and execute detailed system test plans. The data mapping specifies what data will be extracted from an internal data warehouse, transformed, and sent to an external entity.
  • Analyzed SQL stored procedures, functions and database architecture to support analysis efforts and aid in troubleshooting production issues.
  • Developed the required data warehouse model using a Star schema for the generalized model.
  • Used forward engineering approach for designing and creating databases for OLAP model
  • Conducted design walk through sessions with Business Intelligence team to ensure that reporting requirements are met for the business
  • Developed Data Mapping, Data Governance, Transformation, and Cleansing rules for the Master Data Management architecture involving OLTP, ODS, and OLAP.
  • Collaborated with ETL, BI and DBA teams to analyze and provide solutions to data issues and other challenges while implementing the OLAP model.
  • Developed and maintained data dictionary to create metadata reports for technical and business purpose.
  • Developed risk analysis reports using the data analysis software Tableau to analyze loan level information
  • Involved with all the phases of Software Development Life Cycle (SDLC) methodologies throughout the project life cycle.
  • Experience in SQL for database management, data migration, and database validation (SQL Server, Postgres).
  • Developed and implemented SSRS reports with multi-parameter, chart, sub-report and interactive sorting capability for administration andaccountingdepartment.
  • Worked on migration of Treasury transformation project from Bloomberg to BlackRock Aladdin.
  • Utilized the Agile Scrum methodology to analyze the requirements to create stories
  • Performed Requirement Analysis and developed Use Cases, Activity Diagrams using Rational Rose
  • Project-managed the creation of a database that records and takes into account the contracted amounts of domain and image storage capacity.
  • Database modeling, data analysis, query building, data management, and data migration using SQL Server Management Studio and Toad for Oracle.
  • Created project plans in accordance with project management framework standards and procedures; identified ongoing issues and gaps within project plans to eliminate implementation issues.
  • Decomposed high-level information gathered from the business user community into detailed Business Requirements, Functional Specification Documents, Use Cases, and User Stories.
  • Intricately involved in Business Process Flow development, Business Process Modeling, and ad hoc analysis.
  • Incorporated the Rational Unified Process (RUP) to create Requirement Document Specifications using Visible Analyst.
  • Maintained meeting minutes and meeting agendas and communicated them with the SMEs and the stakeholders in a timely manner using MS Word and MS PowerPoint, and created reports using MS Excel.
  • To validate data, extensively wrote complex SQL queries, tied mainly to backend testing.
  • Checked the data flow from front end to backend and used SQL queries to extract the data from the database.
  • Responsible for collecting and analyzing the test metrics and then submitting the reports, which kept track of the status and progress of the testing effort using JIRA.
  • Developed strategies with Quality Assurance group to implement Test Cases in Test Manager for User Acceptance Testing.
  • Conducted walk-through and UAT testing for end user acceptance, deployment and subsequent roll out to verify whether all the User Requirements were catered to by the application.
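
A minimal sketch of a source-to-target mapping test like the SQL scripts mentioned above: rows produced by the documented transformation that are missing from the target indicate a mapping defect. The DSN, schemas, and column names are hypothetical placeholders.

    # Source-to-target mapping test: apply the documented transformation to
    # the source and diff against the target with EXCEPT. Names are
    # illustrative placeholders.
    import pyodbc

    MAPPING_TEST = """
        SELECT cust_id, UPPER(LTRIM(RTRIM(cust_name))) AS cust_name
        FROM   src.customer
        EXCEPT
        SELECT cust_id, cust_name
        FROM   tgt.dim_customer;
    """

    with pyodbc.connect("DSN=warehouse;Trusted_Connection=yes;") as conn:
        mismatches = conn.cursor().execute(MAPPING_TEST).fetchall()
        print(f"{len(mismatches)} rows failed the source-to-target mapping test")
        for row in mismatches[:10]:  # print a sample for triage
            print(row)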
