
Sr. iOS App Developer Resume


Irving, TX

SUMMARY

  • 8+ years of industry experience in Data Analysis, Data Warehousing, Business Intelligence, Data Modeling, Data Mapping and Quality Assurance.
  • Insurance experience includes Personal Property & Casualty and Commercial Automobile product knowledge, Reinsurance for Commercial Property & Casualty, and Group and Life insurance systems.
  • Well versed in Guidewire Policy Center, Billing Center, Claim Center and Contact Center Data Analysis, Mapping, Conversion, Validation and Reconciliation Processes.
  • Experience with data cleansing, data validation, creating source-to-target data mapping documents and metadata documents, and performing source-to-target data comparison (a representative comparison query is sketched after this list).
  • Strong experience in Data Analysis, Data Profiling, Data Migration, Data Integration and Metadata Management Services.
  • Experience working with different reporting tools such as Cognos, OBIEE, Tableau, Business Objects and MicroStrategy.
  • Strong working experience in the Financial and Insurance industries with substantial knowledge of front-end, back-end and overall end-to-end processes.
  • Experience in developing MDM integration plan and hub architecture for customers, products and vendors.
  • Experience with Guidewire insurance software, implementing and extending the DataHub & InfoCenter data warehouse using SAP BODS ETL tools.
  • Extensive experience in testing and implementing Extraction, Transformation and Loading of data from multiple sources into the data warehouse using Informatica.
  • Experience in development of standards for data definitions, data element naming conventions, and logical/physical database design for applications and data warehouse development.
  • Strong working knowledge of SQL, ETL (Informatica, DB2), SQL Server, Confidential, SAS, Tableau and Jupyter, while handling various MATLAB applications in multiple projects.
  • Proficiency in preparing ETL mappings (Source-Stage, Stage-Integration, ISD), requirements gathering, data reporting, data visualization, advanced business dashboards and presenting to clients.
  • Good command of logical and physical entity-relationship data modeling using Erwin, Confidential Designer and Power Designer.
  • Experience with Informatica Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer and Transformations; created mappings for the application.
  • Strong expertise in Excel spreadsheets, pivot tables, graphs and charts for forecast modeling, business and executive reporting.
  • Extensive experience in Data Analysis and ETL Techniques for loading high volumes of data and smooth structural flow of the data.
  • Extensive experience in strategic development of a Data Warehouse and in performing Data Analysis and Data Mapping from an Operational Data Store to an Enterprise Data Warehouse.
  • Strong experience in ETL and data validation using Informatica; worked on Stage, Core & Work (data warehouse) tables.
  • Experience in Business Intelligence (BI) and reporting technology, data analytics and KPIs to provide patterns, trends and insights for decision making.
  • Strong SQL query skills and Spark experience with designing and verifying Databases using Entity-Relationship Diagrams (ERD) and data profiling utilizing queries, dashboards, macros etc.
  • Very strong working experience with HP ALM for requirements management and defect management; responsible for various reconciliation activities.
  • Familiar with database maintenance, installation and support, data mining, data warehousing, data repository and ETL.
  • Conducted meetings with the Guidewire team to elaborate on numerous Billing Center OOB functionalities such as Close Policy Date, Trouble Tickets and Holds, Payment Requests, etc.
  • Extensive experience in Object Oriented Analysis and Design (OOAD) techniques with UML using Flow Charts, Use Cases, Class Diagrams, Sequence Diagrams, Activity Diagrams and State Transition Diagrams.
  • Experience in SQL and PL/SQL, and working skills in developing complex queries, stored procedures, cubes and views.
  • Expertise in handling various forms of data such as Master Data, Metadata and Source Data, with the ability to provide data analytics using various tools (Access, Excel, reporting tools, etc.), and overall working Java experience in providing qualitative & quantitative assessment of data.
  • Extensive experience working as a back-end tester, writing SQL queries and PL/SQL scripts on large data warehouse systems involving terabytes of data.
  • Very good understanding of Business Intelligence and Data Warehousing concepts with emphasis on ETL and Life Cycle Development including requirements analysis, design, development, testing and implementation.
  • Extensively worked on Dimensional modeling, Data cleansing and Data Staging of operational sources using ETL processes.
  • Experience in Data Transformation, Data Loading, Modeling and Performance Tuning.
  • Strong in database design, domain modeling and ER diagram design.
  • Highly motivated self-starter with excellent communication and interpersonal skills.
  • Strong analytical and problem solving skills, capable of addressing relevant facts and recommending solutions.
  • Used the TeamSite content management tool to deploy and manage content-managed files.
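
A minimal sketch of the kind of source-to-target comparison query referenced above, assuming hypothetical staging and warehouse tables (stg_policy, edw_policy) keyed by policy_number:

    -- Keys present in the staging layer but missing from the target warehouse table.
    -- Table and column names are illustrative placeholders.
    SELECT s.policy_number
    FROM   stg_policy s
    LEFT JOIN edw_policy t
           ON t.policy_number = s.policy_number
    WHERE  t.policy_number IS NULL;

A comparison like this is typically run per mapped table and extended with row counts and column-level checksums when key checks alone are not enough.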

PROFESSIONAL EXPERIENCE

Confidential, Westerville, OH

Sr. Data Analyst

Responsibilities:

  • Performed data mining of delinquency-level data, predictive analysis of delinquencies, visualization and exploration of data. Mapped legacy data to PolicyCenter target tables in the required data dictionary.
  • Excellent technical skills in SQL to extract, transform and load data into the database using tools such as Guidewire DataHub.
  • Delivered end-to-end mapping from source (Guidewire application) to target (EDW).
  • Provided ETL logic to pull data from the Guidewire application database into the ODS layer and then into the Guidewire data marts and EDW (Enterprise Data Warehouse).
  • Performed data integration and data mapping and data extraction management along with data cleansing while migrating data from legacy system to Guidewire Policy Center with Guidewire Data Hub.
  • Derived the scope of testing by identifying the data flow from the rating systems and Guidewire DataHub.
  • Developed various Data Management strategy components such as incremental delivery of an Enterprise Data Warehouse, utilization of a data lake for ETL offload, legacy systems archive, and a data integration hub.
  • Worked with the Informatica Data Quality (IDQ) toolkit for analysis, data cleansing, data matching, data conversion, exception handling, and reporting and monitoring.
  • Responsible for data management and data cleansing activities using Informatica Data Quality (IDQ).
  • Utilized Informatica Data Quality (IDQ) for data profiling and matching/removing duplicate data, fixing bad data and fixing NULL values. Created mapping applications in IDQ to load landing data (representative profiling checks are sketched after this list).
  • Loaded the landing tables with the help of ETL jobs and IDQ mappings and workflows.
  • Interacting with end users to gather the functional requirements and customize QlikView dashboards accordingly and involved in developing, enhancing, re-engineering, maintaining, supporting QlikView applications.
  • Involved in Fine tuning of database objects and server to ensure efficient data retrieval.
  • Responsible for getting approval of the newly redesigned Forms from the Forms compliance team.
  • Interacted with the Guidewire team and provided requirements for Forms data based on out-of-the-box features of Guidewire.
  • Based on analysis of legacy Forms data and stakeholder needs, interacted with the Guidewire team about Cov Term patterns and coverage types (Electable, Required, Suggested) to print on Forms.
  • Worked on Risk reports to ensure chargeability conditions, durations and other attributes (age, level, counts, SVC codes, etc.) are correctly translated and ready for PC use across various functionalities (such as Losses, Violations, Discounts, etc.).
  • Actively participated in JAD sessions for overcoming project issues relating to data processing, user interface, and downstream system interactions.
  • Processed data in the Mainframe from the front-end CC, BC, PC & RC systems, using Guidewire and working with SQL Server data.
  • Gathered Product Backlog on the traditional claims system and newly integrated PolicyCenter and carefully elaborated application enhancement specifications detailing in scope/out of scope items, as-is/to-be process maps and critical test scenarios.
  • Experience with complex SQL queries for data profiling with the ETL data integration tool.
  • Garnered technical prowess on Change Management by performing Impact Analysis for Change Requests obtained during Claims processing lifecycle.
  • Analyzed complex business data stores and designed high-level and detailed specifications for Informatica ETL mappings and workflows to transform and load mainframe and distributed-system data to the SAP Confidential database.
  • Frequently used Requirements Traceability Matrix (RTM) for identifying and tracing the linkages among PolicyCenter, BillingCenter and ClaimCenter.
  • Extensive experience in extracting UI based information by querying the Confidential database using SQL queries.
  • Used SQL during the testing phase to verify and validate that the correct information is being provided.
  • Orchestrated User Acceptance Testing (UAT) to guarantee that all the data model, business rules, workflows and user interface requirements have been fulfilled by the web application.
  • Analyzed data errors using the Informatica Workflow Monitor.
  • Worked closely with PO and Scrum Master during RACI review process and release management for seamless integration.
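
A minimal sketch of the kind of profiling checks run ahead of the IDQ cleansing mentioned above; the landing table lnd_claim and its claim_number column are hypothetical placeholders:

    -- Count total rows and NULL business keys in a landing table (names are illustrative).
    SELECT COUNT(*)                                              AS total_rows,
           SUM(CASE WHEN claim_number IS NULL THEN 1 ELSE 0 END) AS null_claim_numbers
    FROM   lnd_claim;

    -- Identify duplicate business keys that would need matching/merging in IDQ.
    SELECT claim_number, COUNT(*) AS dup_count
    FROM   lnd_claim
    GROUP  BY claim_number
    HAVING COUNT(*) > 1;

Results of checks like these feed the exception-handling and monitoring steps carried out in the IDQ mappings and workflows.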

Confidential, Brea, CA

Data Analyst

Responsibilities:

  • Worked on building an Enterprise Data Warehouse to support software application (Guidewire) functionality release.
  • Enhanced the out-of-the-box Guidewire Policy Center, Billing Center and Claim Center data marts as per the client's requirements.
  • Delivered end-to-end mapping from source (Guidewire application) to target (EDW).
  • Extensively involved in analysis of data elements from workers' compensation, including various premium components and their calculations, loss amounts & claim counts (a representative premium reconciliation query is sketched after this list).
  • Reviewed business and functional requirements in the process of identifying the data elements associated at a product level in Guidewire and was responsible for performing the data analysis.
  • Responded to production support issues and researched data anomalies in the data warehouse against the Guidewire Claim Center system, escalating to IT when necessary and keeping business stakeholders informed.
  • Gained a solid understanding of the Guidewire Claims Center application.
  • Performed data analytics on claims data and the associated business processes.
  • Conducted analysis of business and user needs for information, features, reports, and functions in order to develop requirements for data analytics and reporting for operating units and Confidential functions.
  • Performed data analysis and validated the data for the Personal and Commercial Lines products, and participated in the Integration, System, Regression and Performance testing of Billing Center, Policy Center and Payment Manager in the Guidewire application based on the data mapping requirements.
  • Developed a system alongside a Sr. Analyst that runs in the database and returns the appropriate data that is needed.
  • Participated in meetings which reviewed the system/process which was put in place to show and analyze the best possible way to present the proper data.
  • Designed and developed SQL Server Reports based on different business parameters and governance compliance.
  • Created several data models and reports, lists of values and parameters, and integrated them with OBIEE dashboards.
  • Worked along with the Project Manager in defining the project scope, allocating resources, budgeting and creating the Project Charter.
  • Involved in tracking, documenting, capturing, managing and communicating the requirements using Rational RequisitePro, which helped in controlling the numerous artifacts produced by teams working on the same projects by keeping track of simultaneous updates, change notifications and multiple versions.
  • Wrote SQL queries for data validation, analysis and manipulation, and maintaining the integrity of the database.
  • Analyzed the flow of data into downstream data warehouse which fed to data marts and end systems that support analytical and operational reporting.
  • Responsible for extracting specific data sets from the operational data store (ODS) using PL/SQL queries for the data cleansing process and new feed processing.
  • Performed various ETL tasks: extracted data from Confidential, flat files, Excel and Teradata, transformed it (implemented the required business logic) and loaded it into the target data warehouse using SSIS (SQL Server 2008 R2).
  • Involved in ETL specification design for incoming feeds from various data sources and data manipulation on them.
  • Identified, researched, investigated, analyzed, defined and documented business processes, also worked closely with technical leads, information architects, and SQL developer team.
  • Prepared Business Object Models and Business Process Models that included modeling of all the activities of business from conceptual to procedural level.
  • Created and handled sales reports (Salesforce), activity reports, forecasting, referral summaries and other miscellaneous ad hoc reporting
  • Created the DMD (Data Mapping Document) to specify the Data mapping from various data base sources to the target database and used Rational Rose for developing Unified Modeling Language (UML) diagrams like Use Cases diagram, Class diagram, Sequence and Activity based diagrams.
  • Involved in creating and designing a reports framework with dimensional data models for the data warehouse and worked with the development team on SQL Server 2005 tools such as Integration Services and Analysis Services.
  • Worked on requirements loading, creating/updating test cases and tying them back to requirements, test execution, and defect logging/tracking using Test Director.
  • Played a key role in planning the UAT, implementation of system enhancements and data migration and conversions.
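
A minimal sketch of the kind of premium reconciliation referenced above, assuming hypothetical ODS and data-mart premium tables keyed by policy_number and line_of_business:

    -- Lines of business where written premium totals disagree between the
    -- ODS feed and the data mart. Table and column names are placeholders.
    SELECT o.line_of_business,
           SUM(o.written_premium) AS ods_premium,
           SUM(m.written_premium) AS mart_premium
    FROM   ods_premium  o
    JOIN   mart_premium m
           ON  m.policy_number    = o.policy_number
           AND m.line_of_business = o.line_of_business
    GROUP  BY o.line_of_business
    HAVING SUM(o.written_premium) <> SUM(m.written_premium);

Mismatched rows from a query like this are then traced back to individual policies for root-cause analysis before sign-off.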

Confidential, Madison, WI

Data Analyst

Responsibilities:

  • Developed Data Flow Diagrams and data mapping documents for the QAS integration with the Guidewire application.
  • Gathered all the required data from the QAS integration and was involved in the logical & physical data modeling required in Guidewire applications.
  • Integrated Guidewire with QAS.com using Web services for address verification functionality implementation.
  • Provided support to the client team for the production implementation of Release 1 of Guidewire ClaimCenter.
  • Provided ETL logic to pull data from the Guidewire application database into the ODS layer and then into the Guidewire data marts and EDW (Enterprise Data Warehouse).
  • Created/Reviewed data flow diagram to illustrate where data originates and how data flows within the Enterprise Data warehouse (EDW).
  • Worked closely with the Enterprise Data Warehouse team and Business Intelligence Architecture team to understand repository objects that support the business requirement and process.
  • In charge of comprehensive Data Quality by making sure invalid, inconsistent or missing Obligor Risk Ratings are reported to portfolio managers for remediation and ensure that checks are in place to prevent the issue from re-occurring.
  • Facilitated transition of logical data models into the physical database design and recommended technical approaches for good data management practices.
  • Performed database table review and data mapping for a large-scale data conversion project from the Confidential database to the Mainframe.
  • Wrote SQL queries for each test case and executed them in SQL*Plus to validate the data between the Enterprise Data Warehouse and the Data Mart staging tables (a representative test-case query is sketched after this list).
  • Created Data Flow Diagrams and Process Flow Diagrams for various load components like FTP Load, SQL Loader Load, ETL process and various other processes that required transformation.
  • Generated Statements of Work for the Design and Implementation of CRM solutions based on Salesforce.com Methodology.
  • Validated the test data in DB2 tables on Mainframes and on Teradata using SQL queries.
  • Transformed project data requirements into project data models.
  • Wrote Test cases for Enterprise Data Warehousing (EDW) Tables and Data Mart Staging Tables.
  • Worked on performance tuning and loading data for fast access of reports on the client/database server; server balancing, business rules implementation, metadata and data profiling.
  • Generated reports through Business Objects and further analyzed data in Excel, using pivot tables, VLOOKUPs and OFFSET functions to quickly and effectively generate actionable and insightful analysis.
  • Produced functional decomposition and logical models into an enterprise data model.
  • Involved in Data mapping specifications to create and execute detailed system test plans.
  • The data mapping specifies what data will be extracted from an internal data warehouse, transformed and sent to an external entity.
  • Extracted test data from tables and loaded it into SQL tables.
  • Used Reverse Engineering to create data models of existing databases.
  • Conducted data mapping sessions.
  • Participated in UAT and worked with Mercury Quality Center for bug and defect tracking.
  • Used Test Director and Mercury Quality Center for updating the status of all the Test Cases & Test Scripts that are executed during testing process.
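
A minimal sketch of a test-case query of the kind executed in SQL*Plus, as noted above; the EDW and staging tables (edw_claim, stg_mart_claim) are hypothetical placeholders:

    -- Claims loaded to the EDW that never arrived in the data mart staging table.
    -- Table and column names are illustrative.
    SELECT e.claim_id, e.loss_date
    FROM   edw_claim e
    WHERE  NOT EXISTS (SELECT 1
                       FROM   stg_mart_claim s
                       WHERE  s.claim_id = e.claim_id);

An empty result set is the expected (passing) outcome recorded against the corresponding test case in Test Director/Quality Center.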

Confidential, Dublin, OH

Business Analyst

Responsibilities:

  • Researched and analyzed business needs and ways to improve property and casualty operations, specifically policy contracts and policy administration processes.
  • Supported User Acceptance phases by performing data validations and metrics calculations across every layer, including source systems, Staging, Mart and the MicroStrategy layer (a representative layer-by-layer reconciliation query is sketched after this list).
  • Part of the team conducting logical data analysis and data modeling; communicated data-related standards.
  • Worked with Data Warehouse in the development and execution of data conversion, data cleaning and standardization strategies and plans as several small tables are combined into one single data repository system MDM (Master Data Management).
  • Carefully examined data and reports to make sure the conversion was proceeding correctly, ran test scripts with various data to see how new or customized transactions processed through the software, and verified and validated the accuracy of data through the generation of a variety of reports.
  • Provided support across different domain areas such as Auto Assignment, Payments, Claims Data and Appointment Logs, Auto Estimations, Fire Claims, etc.
  • Captured business requirements in form of user stories using JAD sessions, brain storming sessions, focus groups and personal interviews to understand current policy administration system and ensured coverage on business values, story points and acceptance criteria.
  • Identified gaps in existing Policy Administrating Systems and Guidewire PolicyCenter and diligently designed as-is and to-be process, in scope and out of scope documents.
  • Created Use Case models for policy creation to policy end using UML diagrams like Use Case Diagrams, Sequence diagrams and Flow Diagrams using Rational Rose.
  • Analyzed and checked for missing values, data structures and data sanity of the legacy system's data using RStudio.
  • Troubleshot overlap and data quality issues by looking through the existing policy administration system's data lineage, systems and triggers for signs of bugs and malfunctions.
  • Performed data integration, data mapping and data extraction management along with data cleansing while migrating data from the legacy system to Guidewire PolicyCenter with Guidewire DataHub.
  • Elicited, documented, modeled, validated and prioritized functional and non-functional requirements using interviews & data analysis for insurance policy lifecycle i.e. right from policy creation to policy renewal.
  • Collaborated with the Scrum Master and Scrum teams in Estimation and Velocity planning. Also, participated in conducting Sprint Planning, Daily Stand Up, Sprint Review & Retrospective meeting.
  • Assisted the Scrum Master to prepare and monitor burn-down charts for the successful completion of tasks within the time-boxes.
  • Managed sprints i.e. burn down charts, project velocity, sprint backlogs and product backlogs using JIRA.
  • Performed data integration and data mapping management along with data cleansing while migrating data from legacy system to Guidewire PolicyCenter.
  • Performed Walkthroughs of business requirements and functional requirements with the development and testing teams.
  • Set up definitions and processes for scoping and documenting test cases; test phases included integration testing, end-to-end testing and user acceptance testing (UAT); documented the results for policy underwriting rules, rewriting rules, policy rate making and pricing.
  • Analyzed data patterns and trends using Power BI at the time of analysis and UAT.
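
A minimal sketch of the layer-by-layer reconciliation referenced above, assuming hypothetical policy tables in the source, staging and mart layers:

    -- Policy counts per layer; the three counts should agree at the end of a load cycle.
    -- Table names are illustrative placeholders.
    SELECT 'source'  AS layer, COUNT(*) AS policy_count FROM src_policy
    UNION ALL
    SELECT 'staging' AS layer, COUNT(*) AS policy_count FROM stg_policy
    UNION ALL
    SELECT 'mart'    AS layer, COUNT(*) AS policy_count FROM mart_policy;

During UAT the same pattern is repeated for each subject area, and any divergence is investigated before sign-off.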

Confidential, Tallahassee, FL

Data Analyst

Responsibilities:

  • Involved in data conversions and extracted data using SSIS packages (ETL) to transfer data from different server locations and heterogeneous sources like Confidential, Excel, CSV, flat file, XML and Text Format Data.
  • Managed sprints i.e. burn down charts, project velocity, sprint backlogs and product backlogs using JIRA.
  • Shared and tracked version histories of product backlog documents, sprint backlog documents and burn down charts with Confluence tool.
  • Prepared mockup with sample data and created a model with extract, transform and load (ETL) principles in Qlikview.
  • Involved in ETL specification design for incoming feeds from various data sources and data manipulation on them.
  • Performed data integration and data mapping management along with data cleansing while migrating data from legacy system.
  • Responsible for designing logical and physical data models for various data sources on Confidential Redshift.
  • Knowledge of data visualization tools (charts and graphs) such as Tableau and a strong desire to build this skill.
  • Worked on various projects to migrate data from on-premise databases to Confidential Redshift, RDS and S3 (a representative Redshift load is sketched after this list).
  • Data warehousing design experience with Kimball methodologies.
  • Worked with data governance policies, metadata, data quality and change management.
  • Provided team leadership and management of all global data management and data integration services.
  • Performed data integration and data mapping and data extraction management along with data cleansing while migrating data from legacy system to Guidewire Policy Center with Guidewire Data Hub.
  • Involved in developing a file of ACORD Forms used as the standards in all Property and Casualty markets, for both Personal and Commercial Lines of Business.
  • Involved in implementing the Land process of loading the customer/product data set into Informatica MDM from various source systems.
  • Interacted with the Guidewire team and provided requirements for Forms data based on out-of-the-box features of Guidewire.
  • Involved in SQL validation of fields from an additional claims application from the vendor TOS, checking the data against Guidewire Claim Center to confirm it is displayed correctly.
  • Created the DMD (Data Mapping Document) to specify the Data mapping from various data base sources to the target database and used Rational Rose for developing Unified Modeling Language (UML) diagrams like Use Cases diagram, Class diagram, Sequence and Activity based diagrams.
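
A minimal sketch of the kind of Redshift load used when landing a migrated extract from S3, as referenced above; the schema, table, bucket and IAM role names are placeholders:

    -- Load a CSV extract staged in S3 into a Redshift table (all names are illustrative).
    COPY analytics.policy_extract
    FROM 's3://example-bucket/extracts/policy/'
    IAM_ROLE 'arn:aws:iam::123456789012:role/redshift-load-role'
    CSV
    IGNOREHEADER 1;

    -- Post-load sanity check against the expected source row count.
    SELECT COUNT(*) AS loaded_rows FROM analytics.policy_extract;

In practice the COPY would also carry region, compression and error-handling options appropriate to the extract being migrated.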
