- 8+ years of professional experience as a Business Data Analyst in Banking, Capital Markets, Compliance, Equities, Treasury, Finance, Insurance, Loans, and Investment Banking.
- Strong experience in Business and Data Analysis, Data Profiling, Data Migration, Data Conversion, Data Quality, Data Integration, Metadata Management services, and Configuration Management.
- Expertise in Data Modeling, Azure Cloud, Big Data, and evaluating data sources; strong understanding of Data Warehouse/Data Mart design, ETL, BI, OLAP, and client/server applications.
- Designed Physical Data Models (PDM) using the IBM InfoSphere Data Architect data modeling tool and Oracle PL/SQL.
- Experience in working with Data Management and Data Governance based assignments with reporting requirements and managing data quality
- Experience migrating SQL databases to Azure Data Lake, Azure Data Lake Analytics, Azure SQL Database, Databricks, and Azure SQL Data Warehouse; controlling and granting database access; and migrating on-premises databases to Azure Data Lake Store using Azure Data Factory.
- Extensive experience conducting Market Research, Feasibility Studies, Data Analysis, Data Mapping, Data Profiling, Gap Analysis, Risk Identification, Risk Assessment, Risk Analysis, and Risk Management.
- Implemented various data mining techniques using Python packages, SPSS Statistics, and SPSS Modeler tools.
- Expertise and hands-on experience migrating and upgrading insurance suite solutions to Guidewire solutions, as well as integrating third-party solutions with Guidewire solutions.
- Implemented the Actimize Anti-Money Laundering (AML) system to monitor suspicious transactions and enhance regulatory compliance; involved in security testing for different LDAP roles.
- Strong domain knowledge in the areas of CDD, OFAC, transaction monitoring, fraud, and suspicious activity reporting.
- Extensive experience with SAS programming, the DATA step, and various SAS procedures in Base SAS and SAS/STAT, including thorough knowledge of SAS Macro.
- Experience in development methodologies like RUP, SDLC, AGILE, SCRUM and Waterfall.
- Expert in T-SQL DDL/DML; performed most SQL Server Enterprise Manager and Management Studio functionality using T-SQL scripts and batches.
- Extensive knowledge of data flow modeling, object modeling, case analysis, and functional decomposition analysis.
- Provided guidance on architectural and data modeling issues for Banking and Insurance domain projects.
- Extensive experience with Guidewire ClaimCenter Financials (Reserves, Payments, Recovery, Recovery Reserves, Cash Flow Management); knowledge of auto claims processing and underwriting of insurance products.
- Expertise in data analysis functions including data virtualization, data visualization, and data transformation & sourcing; extensive experience with SQL, ETL (SSIS), PL/SQL, and advanced Excel.
- Experienced in handling large amounts of data (data cleansing, data profiling, data scrubbing, and data blending); worked on data quality, data integrity, and Master Data Management.
- Expertise in database administration of test environment databases and data pools, and in identifying performance issues caused by inefficient database administration.
- Proficient in developing Use Case Model, Analysis Model, Design Model, Implementation Model, Use Case Diagrams, Behavior Diagrams (Sequence diagrams, Collaboration diagrams, State chart diagrams, Activity diagrams), Class Diagrams based on UML using Rational Rose.
- Experience in Business Intelligence (BI) Technologies like Microsoft Business Intelligence (SSIS, SSAS, and SSRS), Informatica, Business Objects and OBIEE
- Extensive Data Warehousing experience using Informatica as the ETL tool on various databases such as Oracle, SQL Server, Teradata, and MS Access.
- Excellent at creating various project artifacts, including specification documents, data mapping documents, and data analysis documents.
- Expert-level experience in MS Excel, including macros, lookups, and pivot tables; strong expertise in writing complex SQL queries; strong experience in BI reporting, including Cognos and Spotfire.
- Sound proficiency in analyzing and creating use cases, use case diagrams, swim lane diagrams, activity diagrams, class diagrams, data flow diagrams, business flow diagrams, navigational flow diagrams, and sequence diagrams using Rational Rose and MS Visio.
- Experience dealing with different data sources including flat files, Python, SQL Server, Oracle, MySQL, MS Access, and Excel.
- Experience in the RUP framework and Rational tools, including RequisitePro, Rational Rose, and ClearQuest.
- Involved in GUI testing, Functionality Testing, Regression testing, System Testing, Unit Testing, Integration Testing, Performance Testing, and Stress Testing.
- Strong experience in conducting User Acceptance Testing (UAT) and documentation of Test Cases and in designing and developing Test Plans and Test Scripts.
Confidential - Parsippany, NJ
Sr. Business Data Analyst
- Documented all data mapping and transformation processes in the Functional Design documents based on the business requirements.
- Involved in data mining, transformation, and loading from the source systems to the target system.
- Involved in defining the source-to-target data mappings, business rules, and business and data definitions.
- Provided quality data review for completeness, inconsistencies, and erroneous and missing data according to the data review plan.
- Prepared high-level logical data models using Erwin, and later translated the models into physical models using the forward-engineering technique.
- Made data chart presentations, coded variables from original data, conducted statistical analysis as required, and provided summaries of the analysis.
- Responsible for using MDM to decide how to manage master data life cycle, cardinality, and complexity, and for collecting and analyzing metadata about the master data.
- Loaded operational data from Oracle, flat files, XML files, and Excel spreadsheets into the Oracle target data mart, and used Informatica PowerExchange for mainframe sources such as COBOL files.
- Extensively used Informatica PowerCenter Designer to create a variety of mappings, including expression, sequence generator, source qualifier, router, joiner, update strategy, aggregator, stored procedure, and lookup transformations with embedded business logic.
- Involved in extensive data validation using SQL queries and back-end testing.
- Contributed to the initial data mining work and the development of tools and technology.
- Initiated detailed discussions/functional walkthroughs with stakeholders.
- Produced system documents, functional requirements, and ETL mapping and modeling deliverables; worked on root causes of data inaccuracies and improved data quality using DataFlux.
- Wrote complex SQL and T-SQL testing scripts for back-end testing of the data warehouse application.
- Assisted developers and testing teams to ensure that the requirements are communicated accurately and that the system being developed is verifiable against the requirements.
- Wrote SQL queries to validate source data versus data in the data warehouse, including identification of duplicate records.
- Tested Ad-hoc reports, drill down and drill through reports using SSRS.
- Tested different detail, summary reports and on demand reports.
- Involved in pre- and post-session migration planning for optimizing data load performance.
- Facilitated periodic data loading by using reusable transformations, mapplets, and worklets.
- Used parameter files to define values for mapping parameters and variables to provide flexibility.
- Implemented Slowly Changing Dimensions (SCD) of Type 1 and Type 2 (flag, date) for incremental loading of the target database.
- Worked on performance tuning of SQL queries, Python code, and Informatica mappings, sessions, and workflows to achieve faster data loads to the data mart.
- Extensively used SQL DDL/DML commands with TOAD environment to create target tables and perform different testing for accuracy & quality.
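The Type 2 slowly changing dimension with flag and date tracking mentioned above follows a standard expire-then-insert pattern. A minimal sketch, using Python's built-in sqlite3 and hypothetical table and column names (dim_customer, current_flag, eff_date, end_date are illustrative, not from any specific project):

```python
import sqlite3
from datetime import date

# Minimal SCD Type 2 sketch (flag + date tracking); all names are hypothetical.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("""
    CREATE TABLE dim_customer (
        customer_id  INTEGER,
        address      TEXT,
        current_flag TEXT,   -- 'Y' for the active row, 'N' for history
        eff_date     TEXT,   -- when this version became active
        end_date     TEXT    -- when it was superseded (NULL while current)
    )""")
cur.execute("INSERT INTO dim_customer VALUES (1, 'Old Address', 'Y', '2020-01-01', NULL)")

def apply_scd2_change(cur, customer_id, new_address, load_date):
    # Expire the currently active version of the row...
    cur.execute("""UPDATE dim_customer
                   SET current_flag = 'N', end_date = ?
                   WHERE customer_id = ? AND current_flag = 'Y'""",
                (load_date, customer_id))
    # ...then insert the new version as the current row.
    cur.execute("INSERT INTO dim_customer VALUES (?, ?, 'Y', ?, NULL)",
                (customer_id, new_address, load_date))

apply_scd2_change(cur, 1, "New Address", str(date(2021, 6, 1)))
rows = cur.execute(
    "SELECT address, current_flag, end_date FROM dim_customer ORDER BY eff_date"
).fetchall()
```

History is preserved as expired rows, so point-in-time queries can filter on the date range instead of the flag.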
Confidential, New York City, NY
Sr. Data Analyst
- Worked on the Data Management and Business Solutions team on data analysis, requirements gathering, gap analysis, data mapping, and testing of ETL and BI reporting requirements.
- Created source-to-target data mapping documents for reconciliation between different data systems.
- Report development using the AXIOM Controller View tool: creation of data sources, data models, shorthand, portfolios, aggregations, free-form and tabular reports, and workflows.
- Used the Agile/Scrum method for gathering requirements and documenting user stories to discuss project specifications.
- Worked on business requirements from the CIB and IHC Financial Reporting, Management Reporting, and Regulatory Reporting work streams to improve the process and quality of board reports, management reports, financial statements, and regulatory reports (e.g., FR Y-9C, FR Y-9LP, FR Y-12, FR Y-15, and FR Y-11).
- Expertise in data integration, MDM data governance, data stewardship, data profiling, data quality, data modeling, data analytics, data referencing, data architecture, metadata management, reference data management and master data management.
- Created pricing reports along with pivot tables for review and comparison.
- Worked on Gap analysis (AS-IS) & (TO-BE) by identifying existing Data processes, technologies, and documenting the enhancements needed to meet the end state requirements.
- Extracted, transformed, and loaded data from source systems to Azure data storage services using a combination of Azure Data Factory, T-SQL, Spark SQL, and U-SQL (Azure Data Lake Analytics); ingested data into one or more Azure services (Azure Data Lake, Azure Storage, Azure SQL, Azure DW) and processed the data in Azure Databricks.
- Worked with users and IT to confirm manual data input templates which could be configured by AXIOM, plan Feed and Line Automation, and create Functional Requirements Document (FRD).
- Worked on exporting large data sets to Excel relational worksheets and pivot tables, and created massive formulas and VLOOKUPs to compare data and perform searches.
- Handled several projects for the HR and Finance areas as well as Revenue Accounting.
- Migrated and aggregated legacy infrastructures into Hadoop clusters using SQL and MapReduce techniques.
- Deployed a tool that analyzed absolute products, leading to productivity savings; developed a model to forecast demand for different exchanges.
- Interacted with Cash Management Division and Compliance Office to review the AML (Anti Money Laundering) requirements and provide Positive Pay service to reduce check fraud
- Created use cases to depict the interaction between the various actors and the system and created data flow models, data quality analysis and performed cost/benefit analysis.
- Used advanced functions such as VLOOKUPs, pivots, graphs, and analytical and statistical tool packs in Excel.
- Deployed investment strategies that could adapt to rapid changes in the market using recurrent models with AWS SageMaker and Azure ML.
- Worked with senior developers to implement ad-hoc and standard reports using Informatica, Cognos, MS SSRS and SSAS.
- Extensively involved in data extraction and data mapping from source to target systems using customized PL/SQL code on Unix/Linux platforms.
- Upgraded the application to comply with new financial regulations for the finance industry and implemented MISMO standards.
- Developed programs in COBOL for data extraction, cleansing, and loading into the Data Warehouse.
- Involved in the competitive market analysis research for the Production team to provide insight into enhancements using Power BI.
- Developed the mutual fund benchmark-specific test plans, test scenarios, test cases, and test conditions to be used in testing based on requirements; migrated benchmark indices and securities account reference data.
- Facilitated the User Acceptance Testing (UAT) with Pega System Administrators and Business Users, documented any issues or defects and eventually got sign off from the right parties.
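The source-to-target reconciliation work described above typically boils down to comparing row counts, control totals, and missing keys between the two systems. A minimal sketch, using Python's sqlite3 with hypothetical table and column names (src_trades, tgt_trades, notional are illustrative only):

```python
import sqlite3

# Source-to-target reconciliation sketch; all table/column names are hypothetical.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
    CREATE TABLE src_trades (trade_id INTEGER, notional REAL);
    CREATE TABLE tgt_trades (trade_id INTEGER, notional REAL);
    INSERT INTO src_trades VALUES (1, 100.0), (2, 250.0), (3, 75.0);
    INSERT INTO tgt_trades VALUES (1, 100.0), (2, 250.0);  -- one row dropped in load
""")

def reconcile(cur, src, tgt, key, measure):
    # Compare row counts and a control total between the two tables.
    src_count, src_sum = cur.execute(
        f"SELECT COUNT(*), SUM({measure}) FROM {src}").fetchone()
    tgt_count, tgt_sum = cur.execute(
        f"SELECT COUNT(*), SUM({measure}) FROM {tgt}").fetchone()
    # Keys present in the source but absent from the target.
    missing = [r[0] for r in cur.execute(
        f"SELECT {key} FROM {src} EXCEPT SELECT {key} FROM {tgt}")]
    return {"count_diff": src_count - tgt_count,
            "sum_diff": src_sum - tgt_sum,
            "missing_keys": missing}

result = reconcile(cur, "src_trades", "tgt_trades", "trade_id", "notional")
```

The `EXCEPT` set operator pinpoints exactly which keys failed to land in the target, which is usually the first artifact attached to a reconciliation break report.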
Confidential, Cincinnati, OH
Business Data Analyst
- Gathered requirements from remotely based business users, and defined and elaborated the requirements by holding meetings with the users (who are also Fifth Third employees).
- Designed and implemented basic SQL queries for QA Testing and Report / Data Validation.
- Created SAS programs used for data validation, statistical report generation, and mortgage program validation, and automated the edit check programs using macros.
- Wrote test cases and technical requirements, got them electronically signed off, and created new reports based on requirements.
- Expertise in automation of dashboards for the entire Investment Banking and Capital Markets operations.
- Identified and assessed new and existing risks and elevated any concerns to Capital Markets senior management to mitigate bank exposure or potential loss.
- Utilized simple methods like PowerPoint presentations while conducting walkthroughs with the stakeholders.
- Conducted gap analysis to analyze the variance between system capabilities and business requirements.
- Wrote BRD, FRD, use cases, test scenarios, test cases for testing the functional and non-functional aspects of both ETL jobs and Reporting jobs
- Extensive experience in developing and enhancing applications for financial processes including Hedge Funds Pricing, Trade and Settlements operations.
- Implemented MDM data governance, data quality, and metadata solutions supporting Basel I/II/III compliance as it relates to Data Governance, Data Quality, and Metadata Management.
- Interacted with teams in AFS, ACBS, and InfoLease to extract information for the reports.
- Involved in defining the source to target data mappings, business rules, business and data definitions
- Worked on a daily basis with the mainframe team and lead data warehouse developers to evaluate the impact on the current implementation.
- Worked with Central Database Management (ITIL) for Asset Management, Portfolio Management Modules, CMDB, Remedy, and SharePoint.
- Led Business Intelligence report development efforts by working closely with the MicroStrategy, Teradata, and ETL teams.
- Senior-level SQL query skills (Oracle and T-SQL) in analyzing and validating SSIS ETL and data warehouse processes.
- Drafted daily hedge strategy reports detailing market analysis, adjusted portfolio coverage, and executed trades during the day to brief the Director of Capital Markets and the COO/CAO of the organization.
- Architected and implemented medium- to large-scale BI solutions on Azure using Azure Data Platform services (Azure Data Lake, Data Factory, Data Lake Analytics, Stream Analytics, Azure SQL DW, HDInsight/Databricks, NoSQL DB).
- Involved in conducting JAD sessions to identify the source systems and data needed by Actimize-SAM (KYC/CIP).
- Expertise in product life-cycle management for financial verticals, viz. GoldenSource Enterprise Reference Data Management for securities, customers & counterparties, stock market, and corporate actions.
- Worked with regional and country AML Compliance leads to support the start-up of compliance-led projects at regional and country levels; end-to-end development of Actimize models for the bank's trading compliance solutions.
- Worked on Multiple databases in Mainframe and Client Server (SQL Server, DB2, IMS, VSAM, ISAM)
- Created data governance documents for lineage, data referencing, and metadata documentation.
- Worked with the Enterprise Data Warehouse team and the Business Intelligence Architecture team to understand repository objects that support the business requirements and processes.
- Upgraded the existing application by adding new functionality and new reports; wrote regression test cases and performed smoke testing with users.
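The SQL-based QA and report/data validation work above commonly includes a duplicate-record check: group on the business key and flag any group with more than one row. A minimal sketch with hypothetical table and column names (loan_feed, loan_id are illustrative only):

```python
import sqlite3

# Duplicate-record QA check sketch; table/column names are hypothetical.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
    CREATE TABLE loan_feed (loan_id TEXT, amount REAL);
    INSERT INTO loan_feed VALUES ('L-1', 500.0), ('L-2', 900.0),
                                 ('L-2', 900.0), ('L-3', 120.0);
""")
# Any business key appearing more than once is a duplicate to investigate.
dupes = cur.execute("""
    SELECT loan_id, COUNT(*) AS n
    FROM loan_feed
    GROUP BY loan_id
    HAVING COUNT(*) > 1
""").fetchall()
```

The same `GROUP BY ... HAVING COUNT(*) > 1` pattern works unchanged on Oracle, SQL Server, and Teradata, which is why it is a staple of backend report validation.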
Confidential, San Francisco, California
- Used Data Warehouse Architecture and Design - Star or Snowflake Schema, FACT and Dimensional Tables, Physical/Logical Data Modeling and data flow diagrams.
- Gathered requirements from Business Owners, stakeholders, Data Governance Team and the subject matter experts through meetings to understand needs of the system.
- Implemented and followed a Scrum Agile development methodology within the cross functional team and acted as a liaison between the business user group and the technical team.
- Involved in SDLC including requirements gathering, designing, developing, testing, and release to the Production environment.
- Involved in developing and maintaining the Requirement Traceability Matrix (RTM).
- Analyzed the Enterprise Data Warehouse in Teradata to create accurate fact and dimension tables that could be queried by OBIEE.
- Participated with the team in building the Data Quality projects/solutions using Trillium Software System for customers.
- Involved in creating impactful dashboards in Excel for data reporting using GETPIVOTDATA and VLOOKUP; helped transform raw data into meaningful, actionable information.
- Moderate awareness of Red Hat Linux and UNIX commands; used automated application performance monitoring and resource analysis, Oracle DB tools, troubleshooting, and system/application log reviews to maximize speed and accuracy.
- Extensively performed ETL processes and extracted data from different source systems such as Teradata, Salesforce, MS SQL Server, flat files, and XML.
- Performed numerous Ad-hoc requests, financial reports involving SQL scripts, UNIX, SAS and Teradata.
- Reviewed basic SQL queries and edited inner, left, and right joins in Tableau Desktop by connecting live/dynamic and static datasets.
- Liaison between Data Warehouse and marketing for using customer data efficiently.
- Used SharePoint to store and manage documents in a single repository using a secure web-based system
- Worked on evaluating and comparing different tools for normalized and denormalized test data management with Hadoop.
- Involved in developing and maintaining the ORS model and MDM rules and guidance.
- Involved in unit testing; delivered and documented unit test plans.
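The VLOOKUP-driven dashboard work described above has a direct scripting analogue: key a reference table on the match column, then enrich each raw record by dictionary lookup, with a miss marker playing the role of Excel's #N/A. A minimal sketch on hypothetical sample data (branch codes, regions, and balances below are invented for illustration):

```python
# Scripting analogue of an Excel VLOOKUP; all data below is hypothetical.
region_lookup = [
    {"branch": "CIN-01", "region": "Midwest"},
    {"branch": "SFO-02", "region": "West"},
]
raw_rows = [
    {"branch": "CIN-01", "balance": 1200.0},
    {"branch": "SFO-02", "balance": 800.0},
    {"branch": "NYC-09", "balance": 400.0},   # no match in the lookup table
]

# Build the lookup index once, keyed on the match column.
by_branch = {r["branch"]: r["region"] for r in region_lookup}

# Enrich every raw row; '#N/A' mirrors VLOOKUP's behavior on a miss.
enriched = [
    {**row, "region": by_branch.get(row["branch"], "#N/A")}
    for row in raw_rows
]
```

Indexing the reference table first makes each lookup O(1), whereas a naive VLOOKUP-style scan is linear per row; on large extracts that difference is what makes the scripted version practical.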
Confidential, Chicago, IL
- Gathered requirements to customize ATG provided solutions, services, and ongoing guidance to power a more relevant and personal e-commerce.
- Involved in designing and developing Data Models and Data Marts that support the Business Intelligence Data Warehouse.
- Acted as a liaison between Portfolio Management and the Configuration Team, with a focus on data extraction across SAP and BIQ and directing the analysis with automated systems across BIQ.
- Partially involved in creating and maintaining data dictionaries for naming various objects in the finance domain.
- Used shared containers and created reusable components for local and shared use in the ETL process.
- Worked on creating and maintaining sales reporting using MS Excel queries, SQL in Teradata, and MS Access; produced performance reports and implemented changes for improved reporting.
- Created Excel templates and pivot tables, and utilized VLOOKUPs with complex formulas.
- Researched Banking systems on mainframe and Client server to enhance system.
- Provided weekly, monthly & ad hoc web analytics reports using Adobe Site Catalyst & Google Analytics.
- Worked on developing Tableau data visualization using Cross Map, Scatter Plots, Geographic Map, Pie Charts and Bar Charts, Page Trails, and Density Chart.
- Used Informatica PowerCenter tools: Designer, Repository Manager, Workflow Manager, and Workflow Monitor.
- Substantial report development experience utilizing SQL Server Reporting Services (SSRS), Cognos Impromptu, and Microsoft Excel.
- Wrote PL/SQL procedures for processing business logic in the database; tuned SQL queries for better performance.
- Involved in developing SQL-based data warehouse environments and created multiple custom database applications for data archiving, analysis, and reporting purposes.
- Data mapping, logical data modeling, created class diagrams and ER diagrams and used SQL queries to filter data within the Oracle database.
- Worked with the Enterprise Data warehouse team and Business Intelligence Architecture team to understand repository objects that support the business requirement and process
- Involved in designing and implementing basic SQL queries for QA Testing and Report / Data Validation.
- Created UML structure and behavior diagrams, reported using advanced Word/Excel skills (pivot tables, charting, VBA, etc.), and wrote basic SQL scripts.
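The join-editing work mentioned above (inner vs. left joins behind Tableau data sources) hinges on one behavioral difference: an INNER JOIN drops unmatched left-side rows, while a LEFT JOIN keeps them with NULLs. A minimal sketch using Python's sqlite3 and hypothetical customer/order tables:

```python
import sqlite3

# INNER vs. LEFT JOIN behavior; table/column names are hypothetical.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
    CREATE TABLE customers (cust_id INTEGER, name TEXT);
    CREATE TABLE orders (order_id INTEGER, cust_id INTEGER);
    INSERT INTO customers VALUES (1, 'Acme'), (2, 'Globex');
    INSERT INTO orders VALUES (10, 1);   -- Globex has no orders
""")

# INNER JOIN: only customers with at least one matching order survive.
inner_rows = cur.execute("""
    SELECT c.name, o.order_id FROM customers c
    JOIN orders o ON o.cust_id = c.cust_id
    ORDER BY c.cust_id""").fetchall()

# LEFT JOIN: every customer survives; unmatched ones get NULL order columns.
left_rows = cur.execute("""
    SELECT c.name, o.order_id FROM customers c
    LEFT JOIN orders o ON o.cust_id = c.cust_id
    ORDER BY c.cust_id""").fetchall()
```

Switching a report from inner to left join is exactly how rows silently dropped from a dashboard are recovered, which is why validating the join type is a routine part of report QA.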