- Senior Data Modeler Consultant with over 16 years of in-depth, comprehensive hands-on experience that includes Database design and development, Enterprise Data Architecture solutions, Transactional Database Design, Data Warehousing Strategies and Solutions, ETL design & development, Dimensional Modeling (Star & Snowflake schemas), Data Mapping, Data Conversion, Data Profiling and Data Analysis.
- Analytical thinker with demonstrated strength in establishing strong partnerships with IT and Business team members and vendors at all management levels. Demonstrated strength in managing all aspects of the Software Development Life Cycle (SDLC) to meet enterprise-wide needs using best practices. Proven track record of understanding business needs and establishing an architectural vision.
- Strong PL/SQL development background with excellent performance tuning expertise. Involved in stored procedure development and mentored developers in previous engagements to help them code efficiently.
AREAS OF EXPERTISE
- Architecture, Analysis & Design
- OLTP database design and implementation
- Dimensional Modeling
- ETL Design & Development
- Capacity Planning
- Data Warehousing
- Functional / Technical Design
Data Warehouse Architect/Modeler
- Solely responsible for the architecture, build and maintenance of the GRC Data Warehouse from the ground up. Created conceptual, logical and physical models using Sybase PowerDesigner.
- Created ETL specifications for over 150 ETLs and worked with an offshore team to implement them using Informatica PowerCenter.
- Proactively implemented ETLs to support reconciliation checks between source and target databases.
- Designed summary tables to improve report performance.
- Implemented models to support Process, Risk and Control taxonomies and Risk Control Framework taxonomies.
- Conducted regular meetings with Business stakeholders and Analysts to understand requirements and extend the model accordingly.
- Created several VBScript extensions against the PowerDesigner metadata model to improve modeling productivity.
- Used NSM (naming standards) files to implement enterprise Confidential abbreviations across the model.
- Implemented the warehouse using Type 2 slowly changing dimensions to present point-in-time snapshots of the data.
- Involved in migration of data from legacy systems into the warehouse.
Tools: PowerDesigner 16.5, DB Artisan, IBM DB2 LUW, MS Visio, MS Word, IBM OpenPages, MS Office tools, VBScript
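The Type 2 history pattern used in the warehouse above can be sketched as follows; table and column names are illustrative, not taken from the actual GRC model:

```sql
-- Illustrative Type 2 dimension: each change closes the current row
-- and opens a new one, so any point-in-time snapshot can be queried.
CREATE TABLE risk_dim (
    risk_sk        NUMBER        PRIMARY KEY,  -- surrogate key
    risk_id        VARCHAR2(30)  NOT NULL,     -- natural (business) key
    risk_name      VARCHAR2(200),
    effective_from DATE          NOT NULL,
    effective_to   DATE          NOT NULL,     -- open rows use a far-future date
    current_flag   CHAR(1)       DEFAULT 'Y'
);

-- Point-in-time snapshot of the dimension as of a given date:
SELECT *
  FROM risk_dim
 WHERE DATE '2015-06-30' >= effective_from
   AND DATE '2015-06-30' <  effective_to;
```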
Data Warehouse Data Modeler
- Logical and physical modeling using Sybase PowerDesigner
- Created DDL scripts and managed them in SVN
- Developed several VBScript utilities for use with PowerDesigner models to improve consistency, productivity and reporting
- Set up and managed NSM files to implement the bank's naming standards
- Conducted and participated in model walkthroughs with internal groups as well as the bank's information architecture review team
- Regularly updated tasks and sub-tasks in JIRA for review in the daily Scrum calls
- Assigned work to offshore teams, monitored progress and reviewed implementations
- Reviewed inefficient SQL queries for performance tuning
- Managed release of DDL through various environments
- Developed Perl scripts for efficient release management
- Worked with developers to ensure optimal implementation of the model
Tools: PowerDesigner 16.0, Toad, UltraEdit, MS Visio, MS Word, Oracle 11g, JIRA, SVN, PL/SQL, Perl scripting, Spotfire reporting tool
Data Architect/Data Modeler
- Conducted meetings with Business Analysts and Business stakeholders to understand the requirements
- Using SQL Developer, queried and analyzed existing data, identified the entities, understood the business relationships and verified them with the subject matter experts
- Created logical models using Erwin (version 8.2.5) and conducted walkthroughs with Business stakeholders; documented assumptions and design goals
- Created and implemented the physical data model
- Developed SQL scripts to populate the POC data structures and created SQL reports to validate the completeness of the POC model
- Created a Perl script that generates an Excel workbook with the results of various database queries stored on separate worksheets, with a provision to give each worksheet a custom name. This greatly improved productivity and helped provide clients a consistent report format.
Tools: Erwin (8.2.5), SQL Developer, UltraEdit, MS Visio, MS Word, Oracle 11g, Teradata, SAS, PL/SQL, Perl scripting, Tableau (front-end visualization)
Oracle Data Modeler & Developer
- Converted huge non-partitioned tables to partitioned tables to support the data retention requirements set by the Business. Created a framework to copy data to the partitioned tables during offline hours; on the day of production cutover, each new partitioned table was compared with its non-partitioned counterpart and reconciled for any differences. Used DBMS jobs to run comparisons on multiple threads, comparing roughly 400M rows in about 1.5 hours.
- Used Erwin to reverse engineer the CitiDirect Classic data model
- Designed and developed a framework for purging heavy-volume tables, with support for running operations during off hours, use of PL/SQL collections to improve performance, the ability to restart from where the process left off, and error handling.
- Performance tuning of complex queries, providing suggestions to the application teams on necessary code changes where applicable.
- Worked on a framework to verify that data is replicated accurately by the Oracle GoldenGate utility between the CitiDirect production database and the to-be-deployed Oracle 10g database. The framework compares a slice of data between the two databases for any given table and records the differences; queries are built dynamically and executed by a shell utility.
Tools: Erwin, Toad, UltraEdit, MS Visio, MS Word, Oracle 9i/10g, PL/SQL, Perl scripting
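The table reconciliation described above is commonly built around symmetric MINUS queries; a minimal sketch, with illustrative table names (the actual framework sliced the work by key range across DBMS jobs):

```sql
-- Rows present in the source but missing from the partitioned target:
SELECT /*+ PARALLEL(8) */ * FROM txn_history
MINUS
SELECT * FROM txn_history_part;

-- Rows present in the target that do not exist in the source:
SELECT /*+ PARALLEL(8) */ * FROM txn_history_part
MINUS
SELECT * FROM txn_history;
```

Both queries returning zero rows means the two tables hold identical data.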
Data Modeler Consultant
- Prepared logical and physical models using Erwin to meet vesting schedule requirements; the model was used to create a rules engine that identifies the number of shares vested for any given grantee.
- Designed and implemented a home-grown data profiling solution to identify data anomalies and send automatic emails to the concerned parties.
- Promoted the use of Oracle collections to provide consistent results across the various BOL screens showing grant details.
- Coordinated with various groups such as QA, Release Management and DBAs to promote code through different test environments and finally to production.
- Involved in performance tuning of existing queries using EXPLAIN PLAN and the TKPROF utility.
- Automated database jobs using UNIX shell scripting and cron.
Environment: Erwin 4.1.4/7.x, Toad, Perl, PVCS, ClearQuest, MS Visio, MS Word, Oracle 10g, PL/SQL, Unix Shell Scripting
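The explain-plan side of the tuning work above typically follows this pattern; the query and object names are hypothetical, shown only to illustrate the technique:

```sql
-- Capture the optimizer's execution plan for a candidate query
-- (joins, access paths, estimated cardinalities):
EXPLAIN PLAN FOR
SELECT g.grantee_id, SUM(v.shares_vested)
  FROM grants g
  JOIN vesting_events v ON v.grant_id = g.grant_id
 GROUP BY g.grantee_id;

-- Display the captured plan from PLAN_TABLE:
SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);
```

TKPROF complements this by formatting SQL trace files, showing the plan actually executed along with parse/execute/fetch timings.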
Data Modeler/Architect consultant
- Involved in the Phoenix Centralized Data Integration project to implement an enterprise-level data warehouse with data sourced from various regional systems
- Involved in preparing Star and Snowflake data models using Erwin, process flow diagrams, and Source-to-Target mappings.
- Responsible for creation of the Revenue Master data mart by consolidating revenue feeds from Confidential accounting systems across various regions. Extracted and transformed data from multiple sources (e.g. relational, flat file) and loaded it into Oracle.
- Primarily responsible for introducing the Account concept within the Confidential Revenue system.
- Data modeler for the GCAP (Global Account and Client Profitability) system, which allows Confidential to identify profitable clients.
- Developed Oracle Stored Procedures, Functions, and Packages, to aid the ETL processes
- Responsible for object level migration from development to QA and Production repositories
- Created several concept documents when new architecture frameworks were introduced to facilitate implementation of business processes. The concept documents standardized definitions of business terms and gave every team member across all projects the same understanding, which in turn reduced the number of meetings needed to socialize the concepts.
Environment: Oracle 9i, Informatica PowerCenter 8.x/7.x, UNIX, Windows XP, Business Objects 10, Crystal Reports XI, Toad, Erwin 4.x, PL/SQL, SQL Plus, Shell scripting, Visio
Enterprise Data Modeler consultant
- Prepared Business Requirements, Logical & Physical data models using PowerDesigner, Capacity Plan, Process Flow Diagrams, Source-to-Target mappings, Release Notes, etc.
- Developed processes to load global composites, benchmark history and issuer credit rating data.
- Responsible for the creation of the data mart for client monitor data.
- Loaded data using SQL, PL/SQL, analytical functions, SQL*Loader, and stored procedures & packages.
- Wrote data validation routines to validate data between the Source, Stage & Data mart areas.
- Optimized and tuned query performance using SQL analytical functions, bitmap indexes on non-measure columns, analyzing tables and indexes to gather statistics, and parallel hints. Tuned ETL loads by creating tables in NOLOGGING mode and using direct-path inserts and parallel options.
- Worked closely with DBAs and SAs to manage free space for UNIX volumes and tablespaces. Monitored and scheduled production batch loads; managed DB links, roles & users. Managed separate locally managed, uniform-allocation tablespaces for large, medium & small tables and indexes.
- Wrote shell scripts for the complete ETL process: truncate tables, drop indexes, load, create indexes & analyze tables, run data validation routines, and export/import data between the ETL-Production and Production environments.
Environment: Oracle 9i/8i, PowerDesigner, Visio, PL/SQL, SQL Plus, SQL Loader, Shell Scripting, Toad, SQL-Navigator, Crystal Reports, Unix, Windows NT/2000
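The NOLOGGING/direct-path load technique mentioned above can be sketched as follows; table names are illustrative, and the final call uses SQL*Plus EXEC shorthand:

```sql
-- Stage table created NOLOGGING to minimize redo during bulk loads:
CREATE TABLE sales_stg NOLOGGING PARALLEL 4
AS SELECT * FROM sales_src WHERE 1 = 0;

-- Direct-path, parallel insert: the APPEND hint writes blocks above the
-- high-water mark, bypassing conventional redo on a NOLOGGING table.
ALTER SESSION ENABLE PARALLEL DML;
INSERT /*+ APPEND PARALLEL(sales_stg, 4) */ INTO sales_stg
SELECT * FROM sales_src;
COMMIT;

-- Refresh optimizer statistics after the load:
EXEC DBMS_STATS.GATHER_TABLE_STATS(USER, 'SALES_STG')
```

The trade-off is recoverability: NOLOGGING loads are not covered by redo, so a backup is usually taken after the batch completes.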
Oracle Data Modeler / Developer
- Actively contributed to the design of the data model in a number of design discussion meetings
- Designed and developed database procedures to receive, process and store data from Excel spreadsheets into the Oracle database
- Designed and developed database procedures for accessing the measure data from an intranet web site
- Developed database procedures to help users access measure data using Excel
- Coordinating end-to-end testing of the application
- Production support
Environment: Oracle 8i, ERwin 4.x, Visio, PL/SQL, SQL Plus, SQL Loader, Shell Scripting, Toad, SQL-Navigator, Crystal Reports, Unix, Windows NT/2000, CVS
- Designed a generic communication manager using TCP/IP protocols. The application is responsible for sending and receiving AOF messages to/from other communication managers; an SDF supplies configuration parameters such as the IP address and port number.
- Worked as a team member in SIAC's Application Support group. Responsibilities included maintaining existing AOF applications and creating functional specifications.
- Created Order Processing Simulators to help SIAC participate in an industry wide Y2K testing.
- Worked on an application called 'Direct Quote and Sale Report' to provide quote and execution reports to the ticker tapes.
Environment: VAX - 8600, ‘C’, DCL, VMS Internals, Sybase Transact-SQL for Stored Procedures
- Worked as a lead developer on a multi-tier client/server application involving the Internet, Tuxedo transaction monitoring on UNIX and an Oracle database running on VMS. The application, called BlueList Automated Input, provides subscribers of JJ Kenny's BlueBook easy access to the company's Blue List Information Systems (BLIS) database, allowing them to view and update lineage information interactively from their desktops.
- Designed and developed several Tuxedo services that select lineage per the selection criteria provided by users and update the lineage.
Environment: UNIX, Pro*C, 'C', Oracle PL/SQL, Tuxedo Lib for transaction services