- Strong IT background with 8+ years of experience in the development and implementation of data solutions (MDM, EDM, DW/BI) and report delivery with data/business models.
- Proficient in DB2, Oracle, SQL Server, Teradata, ETL tools (DataStage, SSIS, Informatica) and reporting tools (SSRS, Cognos, Tableau). Excellent understanding of retail and financial business operations and of various analytical tools for effective data analysis.
- In-depth knowledge of data models, data management, structured and unstructured data, DDL scripts, keys, indexes and table partitioning, external tables, data federation, DW/BI methodologies, DWH appliances, OLAP, BI reporting using data discovery and visualization tools with analytics capabilities, spreadsheet-driven reports, and automation of both IT-driven and business-driven BI reports.
- Proven delivery of upstream and downstream EDW architecture, data artifacts, business/data models, gap analysis, impact analysis, future-state designs, TO-BE processes, process/data governance and data migration strategies, plus reports, dashboards and scorecards delivered in real time or from OLTP, ODS, DWH, data mart or virtual DWH sources. Experienced with prototyping and improving scalable, flexible data model and system designs.
- Solid understanding of the software development life cycle (SDLC), change management life cycles, E2E testing and release procedures.
- EDM / Data Analysis / Database / ETL / Messaging / Reporting / Modelling
- Expertise in data analysis on Excel data sources, ERWIN data modelling (CDM, LDM, PDM), SQL queries, UNIX scripting, OLTP, EDW, DWH and data marts, with a good understanding of structured and unstructured data types and analytical functions.
- DW/BI, OLAP, ETL (Informatica, SSIS), MQ, Business Objects (DESKI/WEBI), Xcelsius, MicroStrategy (data visualization tool), Oracle Discoverer, Composite (query federation tool), MS Access, Excel, Visio/ARIS, JIRA, Clarity. Familiar with Tableau, Big Data (Hortonworks), AWS and NoSQL databases.
- TOAD, SQL, PL/SQL Developer, data profiling, Oracle 11g/12c, Oracle Exadata, Netezza, Teradata FSLDM, GoldenSource, SAS analytics, SQL Server. Experienced in job automation using AutoSys and Control-M.
- Well versed in real-time ETL, SoapUI, API calls, Postman, JSON, XML and time-series data. Familiar with Excel macros and Python.
- Conceptual, logical and physical data modeling (ERWIN 7.3 & 8.x), Designer 2000, Embarcadero, Enterprise Data Management (EDM), Master Data Management (MDM), business intelligence, analytics, data quality and data governance.
- Database normalization/de-normalization techniques, data integration and performance tuning of complex SQL queries in OLTP, ODS, EDW/DWH, data mart and BI reporting environments.
- Relational and dimensional modeling for data marts/data warehouses, with exposure to various ETL/ELT tools.
Confidential, Dallas, TX
Senior Data Analyst - Data Solutions Delivery
- Proposed data solutions and designed data models & ETL solutions covering 128 countries for financial data consolidation, reporting and data distribution requirements. Articulated changes in the DWH data model and coordinated with onshore and offshore teams on UAT and monthly releases using Oracle SQL, PL/SQL, Informatica, UNIX, AutoSys and MicroStrategy.
- Mapped business processes and created CDM/LDM, ETL/DQ rules and API designs to cater to new business requirements, incorporating data standardization strategies, and presented them to IT & business teams.
- Documented the data flow and methods for data ingestion into the hub, as well as data distribution from the hub via the DVL view layer.
- Documented the flow and calculation of key reference data, risk metrics/measures (30+ banking risk indicators and scores) and associated metadata in the DWH, and explained them to the technology team.
- Solidified data requirements and identified gaps in the data model. Prepared/reviewed process and data flow diagrams, operational and technical flows, business rules and data mapping specifications, traceability matrices, use cases, FSDs and system user guides, and assisted in user training.
- Documented logic for the collection and aggregation of historical data across the full time series and its consolidation in the DWH.
- Identified data and process integration impacts of the new platform implementation with real-time ETL and overnight batch ETL.
- Performed analysis to identify critical data elements, data setup strategy and mappings for the CMP GoldenSource 8.x (silver/gold data) reference data platform. Performed data analysis and data profiling using Oracle SQL.
- Identified data governance issues and formulated refined business processes and data flows for long-term solutions.
- Prepared data flows and SQL queries for a data profiling/data quality checklist on source/target data.
- Designed Excel functions to handle ingestion for off-model countries.
- Identified and analyzed new sources for Phase 2, researching API data retrieval mechanisms and data integrations from the BEA website. Analyzed and supported data loading in Cloudera for new sources for a POC.
- Provided inputs on data quality improvements and data/business transformation rules as Data Expert/Data SME.
- Delivered an end-to-end data ingestion solution for Platts data with an API process flow to retrieve data in JSON and XML formats for multiple parameters covering the full time series. Designed a template for indicator/country-level notes/labels for all CCC indicators to provide a single source for updates.
- Documented the end-to-end data solution design with full data lineage and traceability for Platts commodities.
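The JSON time-series ingestion described above can be sketched roughly as follows. This is a minimal illustration only: the payload shape and field names ("symbol", "series", "date", "value") are invented for the example and are not the actual Platts API schema.

```python
import json

# Hypothetical payload shaped like a commodities time-series API response;
# field names are invented for illustration, not the real Platts schema.
payload = json.loads("""
{
  "symbol": "BRENT",
  "series": [
    {"date": "2019-01-31", "value": 61.89},
    {"date": "2019-02-28", "value": 66.03},
    {"date": "2019-03-29", "value": 68.39}
  ]
}
""")

def to_rows(doc):
    """Flatten one symbol's time series into (symbol, date, value) rows,
    the shape a DWH staging table would typically expect."""
    return [(doc["symbol"], p["date"], p["value"]) for p in doc["series"]]

rows = to_rows(payload)
print(rows[0])  # -> ('BRENT', '2019-01-31', 61.89)
```

In a real pipeline the flattened rows would then be bulk-loaded into a staging table before ETL consolidation.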
- Performed data analysis to identify the data sources for 20 indicators to meet the mock-up requirements, identified Informatica ETL changes, data model changes and SQL view changes, and prepared test data for USA/EMEA Key Driver reports.
- Prepare data dictionaries and data element naming standards and conventions for multiple computing environments.
Confidential, Omaha, NE
Senior Data Consultant
- Analyzed data model changes and critical data elements (for direct feeds from upstream source systems and strategic feeds from the EDW) to replace the legacy compliance system for screening against sanctions and due diligence lists (PEP, adverse media and AML) across all segments (i.e. CIB, private banking, retail, commercial, employees and securities), including related-party screening. Discussed and validated the end-to-end solution for each source system's screening by Fircosoft and modelled the data & reporting needs.
- Analyzed and designed enhancements to the internal list management system (VERITAS) and internal AML watch-list data quality improvements for 45 countries.
- Analyzed and designed end-to-end automation of reference files (external Acuity watch lists, internal watch lists, resource files, algorithm files) and auto-publishing of files to name and transaction screening systems.
- Worked with the technology and architecture teams to design and develop data solutions in anti-money-laundering (AML Screening GNS) and/or trade surveillance (TSaaS).
- Provided inputs on creating the Tier 2 (Teradata FSLDM) data model & mappings and Tier 3 downstream views based on Tier 1 source data. Reconciled data from source to target systems.
- Performed qualitative and quantitative data analysis, validating ETL rules and DQ rules in line with the DQMF (Data Quality Monitoring Framework) for risk management.
- Participated in the development and maintenance of corporate data architecture, data management standards and conventions, data dictionaries, and data element naming standards for multiple computing environments.
- Supported data review processes by collecting information and fulfilling data and reporting requests from internal stakeholders. Provided inputs on data quality improvements and data/business transformation rules as Data SME.
- Identified data and process integration impacts for the new platform implementation. Enhanced and standardized the interfaces for Excel Loader data/list-entry integration and data provisioning for management information reports.
- Coordinated new data development and ensured consistency and integration within existing application systems.
- Partner with technology to “translate” the business requirements into technical requirements and analysis of data gaps.
- Document the existing process and conduct analysis to identify inefficiencies and opportunities.
- Recommended solutions and options for system performance improvements and built POCs (proofs of concept) for application design and reporting requirements.
- Solidified the conceptual/logical model and data governance, and formalized the migration and distribution of data to downstream operations systems. Assisted in system testing, SIT, UAT and production dry runs.
- Provided regular updates (PPT), RAID logs and project plans to stakeholders and management. Managed CRs and the issues log, with tracking.
Business Intelligence Analyst
- Successfully managed a team during migration of Confidential products from Windows to Linux environment.
- Translated requirements into business rules, setting and reviewing acceptance criteria and defined technical specifications.
- Responsible to manage data, verification & validation of the model thus enhancing effective data modeling.
- Created MS SSIS packages to populate data from various sources (flat files, Excel files, OLE DB) into SQL Server.
- Implemented complex joins, aggregate functions and subqueries in SQL to retrieve data for analysis.
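A minimal illustration of the join/aggregate/subquery pattern described above, using Python's built-in sqlite3 so it is self-contained. The schema and data are invented for the example; the actual work was against SQL Server.

```python
import sqlite3

# Invented sample schema and data, for illustration only.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, amount REAL);
INSERT INTO customers VALUES (1, 'Acme'), (2, 'Globex'), (3, 'Initech');
INSERT INTO orders VALUES (1, 1, 100), (2, 1, 250), (3, 2, 40), (4, 3, 500);
""")

# Join + aggregate + subquery: customers whose total order value
# exceeds the average total across all customers.
sql = """
SELECT c.name, SUM(o.amount) AS total
FROM customers c
JOIN orders o ON o.customer_id = c.id
GROUP BY c.name
HAVING total > (SELECT AVG(t)
                FROM (SELECT SUM(amount) AS t FROM orders GROUP BY customer_id))
ORDER BY total DESC;
"""
print(con.execute(sql).fetchall())  # -> [('Initech', 500.0), ('Acme', 350.0)]
```

The same shape (join, GROUP BY, correlated or derived-table subquery) carries over directly to T-SQL or Oracle SQL.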
- Performed data analysis, statistical analysis, generated reports, listings and graphs using Tableau.
- As an analyst, performed analytics on call-processing data and reported the resulting insights, thus increasing system performance.
- Streamlined data from databases to run several machine learning algorithms in R to capture trends.
- Built and validated variety of statistical models, providing analytical insights to improve criteria and strategy.
- Presented user stories and status updates through power point presentations to the clients.
- Created reports on Tableau Desktop, published them on Tableau server and presented the insights through video conferences to clients and internal peers in every sprint review meeting.
- Worked with cross functional teams from different locations such as Sweden, China, Italy and Australia.
- Identified and analyzed activity diagrams, use case diagrams, functional models, structural models, behavioral models, etc.
- Understand client products and create data mapping specifications as needed for the business intelligence teams.
- Perform data discovery, profiling and analysis by source.
- Build and debug stored procedures to support ETL processes.
- Build SSIS packages to implement audit, balance and control framework components.
- Developed and debugged reusable ETL components used in the ETL architecture.
- Built SSRS reports that can be exposed or emailed to FT stakeholders.
- Assisted in producing OLAP cubes and wrote queries to produce reports using SQL Server 2012/2014 Analysis Services (SSAS) and Reporting Services (SSRS).
- Conducted ETL performance tuning, troubleshooting, support and capacity estimation.
- Created drill down, drill through, cascaded, sub, and parameterized reports in SSRS.
- Collaborated in data analysis, profiling and cleaning of business data.
- Created & exported data visualizations (worksheets & dashboards) in Tableau Desktop 9/Power BI for ad hoc project requests.
- Understanding and translation of various existing SSRS reports and Tableau dashboards.
- Automated the Reports using various SSIS packages and jobs.
- Involved in analyzing, defining, creating and evaluating baselines (scope, time, quality) and estimation.
- Involved in preparing project charter and SOW for the project.
- Performed AS-IS and TO-BE analysis and authored BRD documents.
Environment: PL/SQL, Oracle 11g, SQL Loader, Unix shell scripting, Java, Perl, Windows, Linux
- Responsible for writing PL/SQL procedures, functions, triggers and materialized views, and for optimizing existing modules.
- Responsible for basic DBA functions: installation, user management, and space and session management.
- Performed logical and physical database design, review and analysis.
- Performed cold and hot database backups using RMAN.
- Loaded data into the Oracle DB from various reports, flat files, CSV files and log files using SQL*Loader.
- Prepared PL/SQL packages, procedures, functions, triggers, views and indexes.
- Wrote sequences, views and materialized views as per requirements.
- Performed SQL performance tuning.
- Improved performance and resolved maintenance issues such as lock removal.
- Performed Oracle SQL performance tuning on developed code using Explain Plan.
- Applied advanced PL/SQL concepts: VARRAYs, associative arrays, nested tables, PL/SQL optimization, table functions and bulk transactions.
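The bulk-transaction idea mentioned above (binding a whole collection in one call instead of inserting row by row, as PL/SQL's FORALL does) can be sketched with the analogous batch pattern in Python's sqlite3. The table and data here are invented for illustration; the original work was in Oracle PL/SQL.

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE rates (ccy TEXT, rate REAL)")

# Row-by-row inserts pay a per-statement overhead; binding the whole
# collection in one call (the FORALL analogue) batches that cost.
rows = [("EUR", 1.08), ("GBP", 1.27), ("JPY", 0.0067)]
with con:  # one transaction wraps the whole batch
    con.executemany("INSERT INTO rates VALUES (?, ?)", rows)

count = con.execute("SELECT COUNT(*) FROM rates").fetchone()[0]
print(count)  # -> 3
```

In PL/SQL the equivalent would be a collection populated by BULK COLLECT and written back with FORALL, committed once per batch.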
Environment: Java, Oracle9i, PL/SQL, Unix, Shell Scripting.
- Performed regression, system, functional and performance testing on the released packages for multiple Confidential nodes.
- Automated SQL scripts to implement functional test cases.
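Automating SQL functional test cases usually means wrapping expected-vs-actual checks in a small harness that runs each query and compares the result. A sketch of that idea follows; the schema and the specific checks are invented for illustration.

```python
import sqlite3

# Invented sample data standing in for the system under test.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE accounts (id INTEGER PRIMARY KEY, status TEXT, balance REAL);
INSERT INTO accounts VALUES (1, 'OPEN', 120.0), (2, 'CLOSED', 0.0);
""")

# Each functional test case is a (name, query, expected scalar) triple.
cases = [
    ("no negative balances",
     "SELECT COUNT(*) FROM accounts WHERE balance < 0", 0),
    ("closed accounts are zeroed",
     "SELECT COUNT(*) FROM accounts WHERE status='CLOSED' AND balance <> 0", 0),
]

failures = [name for name, sql, expected in cases
            if con.execute(sql).fetchone()[0] != expected]
print("FAILED:" if failures else "ALL PASSED", failures)
```

Adding a case is then just adding a triple to the list, which keeps the test suite data-driven and easy to rerun after each release.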
- Documented, tracked and communicated software test plans, test scripts, test analyses and unresolved issues.
- Managed, assigned and followed up trouble reports using JIRA software.
- Created and maintained Operating System Installation Documents.
- Involved in Application analysis, Design, Coding, Testing, Development and Implementation.
- Authored PL/SQL and SQL scripts required for day-to-day maintenance in production support.
- Documented and defined package descriptions.
- Documented IN and OUT parameters for different functions and procedures after discussions with the design teams.
- Modified stored procedures and stored functions in existing applications.
- Involved in maintenance & enhancements of the existing applications using PL/SQL code.