Consultant Resume Profile
SUMMARY:
- More than twenty years of experience in engineering, product management, professional services, data warehousing, business intelligence, and the design and architecture of mission-critical OLTP systems on commercial and open source platforms.
- Extensive experience in deployment of large-scale BI solutions on relational, columnar and Big Data platforms.
- Architected and implemented DW and BI solutions based on Pentaho BI, Pentaho Kettle (ETL), Mondrian and Talend Open Studio. Architected and implemented BI solutions based on Informatica, Oracle, Oracle Warehouse Builder and Oracle Business Intelligence (OBIEE). Data visualization expertise in Tableau, OBIEE and Pentaho Analyzer.
- Expertise in Big Data platforms: Hadoop, Hive, HBase, Google BigQuery. Architected solutions that process massive amounts of data on corporate and AWS/cloud-based servers.
- Logical and physical data modelling using Erwin, Oracle Designer, MySQL Workbench, S-Designer, ER Studio and PowerDesigner. Schema design for Big Data solutions based on HBase.
- Architecture for solutions based on combinations of open source and commercial platforms such as Hadoop, Hive, Oracle, MySQL, PostgreSQL, Google BigQuery, HBase, Pentaho, Phoenix, Greenplum and others
Highlights:
- Team Lead for teams of 5-10 developers. Managed very large-scale global projects for multinational companies. Solution Architect for a global team. Data warehouse/data architect for very large-scale data warehousing projects
- Subject-area expertise in architecting data warehouse/BI solutions based on data from enterprise applications like Oracle Applications, SAP, PeopleSoft, Matrix, Clarify, etc.
- Expertise in implementing highly optimized back-end code (packages, procedures, functions) for very high-volume customer-facing web sites
- Expertise in implementing data warehouse and business intelligence solutions using the open source platforms MySQL, Talend Open Studio and Pentaho
- Project management of offshore projects; headed teams in the USA and India.
- Data modelling for OLTP and OLAP (STAR schema) systems using Erwin; process modelling using Oracle Designer and BPwin.
- Responsible for data warehouse architecture as ETL team lead. Designed/architected framework for real-time data warehouse loads from multiple data sources
- Architected data models for very large-scale data warehouses with terabytes of storage using Ralph Kimball's methodology. Performed capacity planning for databases and database objects to handle large volumes of data
- Expert-level knowledge of data warehousing and business intelligence concepts
- Extracts to load data into the data warehouse from operational systems and third-party data files using Oracle Warehouse Builder, Informatica, PL/SQL and Perl. Flat-file parsing routines in Perl.
- Data warehouse development using Informatica and Oracle Warehouse Builder
- Data flow diagrams, ER diagrams and CRUD matrices using Oracle Designer
- Logical and physical database design for Oracle- and Sybase-based systems. Physical parameter settings for database objects for optimal performance
- Requirement analysis, high-level and low-level design. Wrote specifications for back-end and front-end development; created technical design documents
- Wrote back-end packages, procedures, functions and triggers using SQL-Programmer. Scripts for database object/schema creation and automation of data loads. Expert in PL/SQL
- SQL statement tuning using Explain Plan, TRACE and TKPROF; also used the SQL-Expert tool
- Creation of schemas for development, test and production environments, database optimization
- Strong Oracle database server administration
- Performance tuning at storage, operating system, database and application levels
- Setting up the backup and recovery strategies
- Production Database Support
- Strong programming skills. Front-end development using Oracle Forms 4.5/5/6i, Oracle Reports 2.5/3/6i, PowerBuilder 5.0/6.5 and Oracle WEB-DB
- Strong knowledge of Oracle development and design tools
- Working knowledge of web technologies including Java, ASP, JSP, JavaScript, VBScript and HTML
- Knowledge of Microsoft .NET technologies including Windows Forms, VB.NET and ASP.NET
- Making estimates and plans for assigned work; prototype development; writing specifications for forms and reports development; setting up coding standards to be used throughout system development
- Data migration from old databases and file-based legacy systems; Sybase-to-Oracle data conversion
- Design, development and performance tuning of e-commerce database applications on Oracle and MYSQL platforms
- Installation, configuration and administration of Oracle9i Application Server; programming using mod_plsql and PL/SQL Server Pages (PSP); creating and managing portals using the Oracle9i Portal product; writing database portlets
- Expertise in OBIEE 10.1.3.3.3/10.1.3.4, Siebel Analytics 7.5/7.7/7.8.4 and Informatica 8.6/8.1.1/7.1.4/6.x.
- Extensive experience with the OBIEE Administration Tool, OBIEE Answers, OBIEE Intelligence Dashboards, BI Publisher, Delivers and iBots with OBIEE Applications.
- Experience in installation, configuration and administration of OBIEE and DAC
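The SQL tuning workflow named in the highlights (Explain Plan, TRACE, TKPROF) follows a standard Oracle pattern; a minimal sketch, in which the sales table and query are hypothetical:

```sql
-- Capture the optimizer's plan for a candidate query (sales is hypothetical).
EXPLAIN PLAN FOR
SELECT customer_id, SUM(amount)
FROM   sales
WHERE  sale_date >= DATE '2012-01-01'
GROUP  BY customer_id;

-- Display the captured plan from PLAN_TABLE.
SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);

-- Trace the session, run the workload, then format the trace file
-- on the server with: tkprof <tracefile> report.prf
ALTER SESSION SET SQL_TRACE = TRUE;
```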
Skill Set:
- Database Tools
- Oracle 11g, Oracle 10g, Oracle 9i, Oracle 8i, Oracle 8, Oracle 7.x, Oracle9i Application Server, Oracle WEB-DB, Oracle Migration Workbench, Sybase System XI, Designer/2000, Developer 6i (Forms 6i and Reports 6i), Forms 4.5, Reports 2.5, PL/SQL, Pro*C, SQL*Plus, PowerBuilder 6.5/5.0, Erwin, Oracle Enterprise Manager, Oracle Warehouse Builder, MySQL, Informatica, MicroStrategy, Talend Open Studio, Pentaho Kettle, PostgreSQL, PostGIS, GeoKettle, PowerDesigner, Oracle Designer, MySQL Workbench, Tableau, Pentaho Schema Workbench, Mondrian, Hyperion
- Operating Systems / Networks
- Linux, AIX, SCO UNIX, HP-UX, Sun Solaris, Windows NT, Novell NetWare, LAN WorkPlace, TCP/IP
- Languages / Packages
- C, Business BASIC, HTML, ASP, JavaScript, JSP, Info Reports, IDOL IV Query, Platinum Desktop DBA, SQL-Programmer, TOAD, Visual Basic, VBScript, Perl
Work Experience:
Projects on the Pentaho platform
Confidential
Environment: Unix, MYSQL, Pentaho Kettle, SQL, Toad for MYSQL, Excel data, CSV data, Oracle, Pentaho Reports, Pentaho Analyzer, Mondrian OLAP, Hadoop, Hive, Google BigQuery
Project summary: the project supported worldwide reporting and analysis needs for the organization.
Responsibilities
- Analysis of data related to proposals, opportunities, orders, invoices, payments, cancellations, etc.
- Implemented best practices for the architecture of an enterprise data warehouse supporting reporting solutions based on the Pentaho platform. The reporting platform was used globally by a very large user base
- Architected dynamic incremental load architecture and periodic snapshot architecture
- Architected BI solution based on the Pentaho platform. Components used: Pentaho Reports, Pentaho Data Integrator, Pentaho Analyzer and Mondrian cubes
- Complex reports integrated into the Java-based application
- Reports utilized multiple parameters to support multiple hierarchy levels and a multi-currency system
- Report queries dynamically created based on parameters
- Drill functionality in reports using dynamic URLs; custom sorting in reports allowing users to sort data on any field in the report
- Customized look and feel to match the parent Java application
- Architected the entire data warehouse and ETL loads for the project
- Created multiple cubes in Mondrian to support data analysis
- Data visualization using Tableau
- ETL management, tuning and support
- Complex MRR (monthly recurring revenue) calculations for the entire customer base
- Ragged hierarchy support
- Dynamic data bucketing in reports for selected time period and other dimensional attributes
- Extensive use of custom JDBC due to dynamic nature of reporting needs
- Processed machine-generated data using Hadoop, Hive and Pentaho. Summarized data was then loaded into MySQL for reporting
- Proof of concept using Google BigQuery to process large data files
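The Hive summarization step described above (machine-generated data reduced before loading into MySQL) can be sketched in HiveQL; all table and column names are hypothetical:

```sql
-- Expose raw tab-delimited event files already in HDFS as a Hive table.
CREATE EXTERNAL TABLE raw_events (
  host     STRING,
  event_ts STRING,
  metric   DOUBLE
)
ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t'
LOCATION '/data/raw_events';

-- Summarize per host per day; the summary table (assumed already created)
-- is what gets exported to MySQL for reporting.
INSERT OVERWRITE TABLE daily_summary
SELECT host,
       to_date(event_ts) AS event_day,
       COUNT(*)          AS events,
       AVG(metric)       AS avg_metric
FROM   raw_events
GROUP  BY host, to_date(event_ts);
```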
Confidential
Environment: Unix, MYSQL, Pentaho Kettle, SQL, Toad for MYSQL, Excel data, CSV data, Oracle, Hadoop, Hive
Project summary: the project was part of a study funded by the Bill & Melinda Gates Foundation to identify effective teaching practices that help students improve in ELA, Maths and Biology. A large number of school districts participated in the study. I was involved in managing and processing school roster data; loading student, class, teacher, teacher-assignment and test-score data from flat files into a MySQL database; matching volunteer data with roster data; and extracting data to CSV files for further processing.
Responsibilities
- Analysis of data related to volunteer teachers, participating students, courses, classes, assignments, tests, test scores
- Requirement analysis related to reporting and data management needs. Created technical design documents and source-to-target mapping document. Analysis of source system data
- Logical and physical data modelling using Erwin
- Documented data cleansing needs and procedures for cleaning source data
- Documented data migration strategy, ETL architecture and design
- Created Pentaho Jobs and transformations to load data from CSV, Excel files to MYSQL database
- Created Jobs and transformations in Pentaho Kettle to move Oracle data pump files
- Automated FTP and processing of student, teacher, roster, class, course and other data from a remote site to the local site
- QA of data processed through the ETL process
- Troubleshooting of issues: finding the issue source and resolution
- Wrote complex SQL queries to move data from the staging area to dimension and fact tables
- Worked closely with database architect for database design
- Performance tuning of the data load process
- Created database objects like Tables, indexes
- Created referential integrity among database tables
- Data load from files into the Hadoop cluster; Big Data summarization in Hive using Pentaho; loading of summarized data into Oracle and MySQL for data visualization and ad-hoc reporting
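The staging-to-dimension moves described above might look like the following MySQL sketch; the file path, tables and columns are hypothetical:

```sql
-- Load a roster CSV into a staging table.
LOAD DATA LOCAL INFILE '/data/roster.csv'
INTO TABLE stg_roster
FIELDS TERMINATED BY ',' ENCLOSED BY '"'
IGNORE 1 LINES
(student_id, teacher_id, class_id, school_year);

-- Insert only students not already present in the dimension.
INSERT INTO dim_student (student_id, school_year)
SELECT DISTINCT s.student_id, s.school_year
FROM   stg_roster s
LEFT   JOIN dim_student d ON d.student_id = s.student_id
WHERE  d.student_id IS NULL;
```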
Confidential
Role: Consultant
Environment: Linux, PostGreSQL, SQL, Pentaho Kettle, CSV, Excel, Data integration, Reporting
Project summary: Confidential is a website that provides a platform for students to prepare for SAT and ACT exams. Individual students, school districts and schools are the main users of the service. This project was related to the management and analysis of data on student performance, teachers, subjects, online courses, performance of courses and classes offered by PrepMe.com, student activity, student test scores, student attendance and online logins
Responsibilities
- Requirement analysis, created technical design documents, created source to target mapping document.
- Analysis of source data: students, courses, teachers, schools, tests, test scores, users, user logins, lessons
- Designed the database to store data related to students, courses, lessons, teachers, schools, tests, test scores, user, user login etc
- Loaded data into staging tables from OLTP production systems
- Data cleansing, data association with master data
- Created logical and physical data models
- Designed and developed Pentaho jobs and transformations to load data into dimensions and facts
- Review, correction and verification of student records
- Documentation of the entire process used to load and aggregate the data
- Created reports/analysis related to:
- Students who recently enrolled
- Students who are not reading the online lessons/courses
- Improvement in student scores in various subjects
- Student, course, lesson analysis by school, school district over a period of time
- Students who have not logged into the system for a long time
- Information related to first login, last login, previous test scores comparison with current test scores
- Worked on UAT and QA
- Setup and management of the Pentaho Kettle repository in a MySQL database
Confidential
Environment: Unix, MYSQL, Pentaho Kettle, SQL, Toad for MYSQL, Excel data, CSV data, Pentaho BI platform, Pentaho Metadata editor
Project summary: Motormouths.com is a website that provides ranking information for various car makes and models. The service helps car buyers in the decision-making process by presenting critics' points of view about a car. The system also provides a ranking/score for each car
Responsibilities
- Analysis of data related to makes, models, vehicles, critics, publishers etc.
- Designed and created Pentaho Kettle based ETL platform to load data from various data sources to MYSQL based data warehouse
- Data modelling for STAR schema
- Architected the business model for the Pentaho solution using Pentaho Metadata Editor
- Implemented Pentaho based reporting and Ad-hoc reporting analysis platform
- Configured Pentaho BI and Pentaho Data Integrator (Kettle) on a cloud server
- Worked closely with analysts on needs related to data analysis
Confidential
Environment: Unix, Oracle 10g, OBIEE 10.1.3, Oracle Applications (EBS), SalesForce.com, Informatica, TOAD, Data modelling, SQL, Erwin, Performance tuning, Real time data warehousing, Data integration, SQL Server, Operational data source, web services
Responsibilities
- Requirement analysis; created technical design documents and source-to-target mapping document. Analysis of source system data: Oracle EBS, SalesForce.com, HR
- Architecture for customized solutions related to Finance, Sales, Marketing and Operational business intelligence on the Informatica, Oracle 10g and OBIEE platform
- Architecture of subject areas supporting reporting for worldwide revenue trend, revenue by product line, global financials, global cash cost, net income / detail balance sheet, financial KPI metrics (market presence, revenue, margins, costs, headcount), and operational reports related to IBX, service requests, etc.
- Data modelling: STAR schema design and architecture; created new facts and dimensions and used out-of-the-box OBIEE facts and dimensions related to Financial Analytics
- Architecture and design of the business model. Identified facts, dimensions and hierarchies of the business model. Created complex calculated measures based on the time dimension to calculate Quarter Ago and Year Ago metrics
- Identified the granularity level of the business model based on the end user requirements.
- Extensively used OBIEE Administration Tool for Customizing and modifying the physical, business, and presentation layer of the repository
- Worked closely with business analysts and stakeholders
- Led a team of OBIEE developers; created technical design documents related to ETL and report development
- Architected incremental ETL load strategy; architected data aggregation strategy
- Architected and implemented an operational data store (ODS) to support various customer-facing web services and operational reporting
- Performance tuning at database, ETL and SQL level
Confidential
Environment: Linux, Oracle 10g, INFORMATICA 8.6, TOAD, Data modelling, SQL, Erwin, Performance tuning, Real time data warehousing, Data integration, SQL Server
Responsibilities
- Requirement analysis, created technical design documents, created source to target mapping document.
- Architected deployment of Informatica for production environment, worked with DBA and Unix admin teams
- Installed and configured Informatica 8.6, managed Informatica repository, configured repository and integration services, created and managed users, folders and security
- Architected near real time data integration solution on Oracle and Informatica platform
- The implemented solution moves data from thousands of sources to thousands of targets. Dynamically created parameter files using ETL metadata and mappings; implemented incremental load architecture
- Created mappings to move data, created worklets and workflows, created workflow schedules
- Architected ETL metadata to support data integration among sources and targets using a set of shared mappings and worklets
- Designed and implemented error handling and restart strategy
- Implemented dynamic data partitioning strategy using Oracle table partitioning
- Performance tuning of ETL maps and SQL
- Architected Metadata to manage ETL loads, wrote stored procedures, packages to manage the ETL metadata
- Architected solution to dynamically apply source and target connections, SQL override and log file location at run time
- Implemented SQL transformation for moving data from dynamic number of related source tables
- Implemented strategy to migrate ETL solution from development to QA and QA to production environment
- PL/SQL coding to implement complex business rules
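The dynamic parameter-file generation mentioned above (parameter files built from ETL metadata) can be sketched in PL/SQL; the metadata table, directory object and procedure name are all hypothetical:

```sql
-- Write an Informatica-style parameter file from an ETL metadata table.
-- ETL_METADATA and the PARAM_DIR directory object are assumed to exist.
CREATE OR REPLACE PROCEDURE write_param_file(p_workflow IN VARCHAR2) IS
  l_file UTL_FILE.FILE_TYPE;
BEGIN
  l_file := UTL_FILE.FOPEN('PARAM_DIR', p_workflow || '.par', 'W');
  UTL_FILE.PUT_LINE(l_file, '[Global]');
  FOR r IN (SELECT param_name, param_value
            FROM   etl_metadata
            WHERE  workflow_name = p_workflow) LOOP
    UTL_FILE.PUT_LINE(l_file, '$$' || r.param_name || '=' || r.param_value);
  END LOOP;
  UTL_FILE.FCLOSE(l_file);
END;
/
```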
Confidential
Environment: Linux, MYSQL, Pentaho Kettle Pentaho Data Integrator , TOAD, Data modelling, SQL, Erwin, Performance tuning, Real time data warehousing
Responsibilities
- Data warehouse/business intelligence architect/designer/developer. Requirement analysis, high-level and low-level design. Worked with business users to understand business needs and convert them into technical specifications.
- Architected and implemented real time data warehouse/BI solution for email based advertising system
- Created incremental load strategy to provide real time data for operational reporting on various events created by receiver of email
- Performance tuning of the SQL and ETL platform, using massive parallel processing
- Designed and implemented data loading mappings from source system to MYSQL based data warehouse using Pentaho Kettle open source ETL tool
- Data modelling for data warehouse STAR Schemas using Erwin
- Shell scripting for execution of ETL loads
- Data partitioning strategy
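An incremental load of the kind described above typically filters the source on a high-water mark kept in a control table; a MySQL sketch with hypothetical tables:

```sql
-- Pull only rows changed since the last successful load, upserting into
-- the warehouse table (etl_control holds the high-water mark).
INSERT INTO dw_email_event (event_id, recipient, event_type, event_ts)
SELECT e.event_id, e.recipient, e.event_type, e.event_ts
FROM   src_email_event e
WHERE  e.event_ts > (SELECT last_load_ts FROM etl_control WHERE job = 'email')
ON DUPLICATE KEY UPDATE
       event_type = VALUES(event_type),
       event_ts   = VALUES(event_ts);

-- Advance the high-water mark after a successful load.
UPDATE etl_control SET last_load_ts = NOW() WHERE job = 'email';
```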
Confidential
Environment: Linux, Oracle 9i, Oracle 10g, Informatica, Siebel Analytics/OBIEE, SAP, SQL server, TOAD, Oracle Enterprise Manager, PL/SQL, SQL, Unix Shell scripts, Data modelling
Responsibilities
- Data warehouse/business intelligence architect/designer/developer. Requirement analysis, high-level and low-level design. Worked with business users to understand business needs and convert them into technical specifications.
- Architected enterprise data warehouse solution to load data from SAP and other applications
- Data modelling for data warehouse STAR schemas using Erwin
- Architected incremental load strategy to load data using the Informatica ETL tool. Wrote code to create parameter files to facilitate near real-time data loads into the data warehouse.
- Designed and developed Informatica mappings for type1 and type2 dimensions and facts
- Loaded data from flat files, SQL server and SAP data sources
- Analyzed SAP data and created source to target mappings to load sales, customer, inventory, material and purchase order data from SAP
- Created STAR schema and real time ETL to load search data from e-commerce website.
- Database migration from File Maker Pro to Oracle 10g
- Performance tuning, working with Siebel analytics/OBIEE developers for helping on report analysis and data availability
- Tuning of Informatica mappings/sessions. Informatica repository management
- Performance tuning of queries and PL/SQL code. Extensively used explain plan, trace and tkprof utilities
- Worked with DBA team for database related performance tuning, database tuning
- Built the Physical Layer, Business Model and Mapping Layer, and Presentation Layer of a repository
- Created dimension hierarchies and level-based measures based on business requirements
- Defined and created aggregate table for better performance
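The type 1/type 2 dimension loads described above follow a standard pattern: expire the current row and insert a new version. A SQL sketch with a hypothetical customer dimension:

```sql
-- SCD type 2: expire the current row when a tracked attribute changes.
UPDATE dim_customer d
SET    d.current_flag = 'N',
       d.effective_to = SYSDATE
WHERE  d.current_flag = 'Y'
AND    EXISTS (SELECT 1 FROM stg_customer s
               WHERE  s.customer_id = d.customer_id
               AND    s.region     <> d.region);

-- Insert a new current version for changed and brand-new customers.
INSERT INTO dim_customer
  (customer_key, customer_id, region, effective_from, effective_to, current_flag)
SELECT dim_customer_seq.NEXTVAL, s.customer_id, s.region, SYSDATE, NULL, 'Y'
FROM   stg_customer s
WHERE  NOT EXISTS (SELECT 1 FROM dim_customer d
                   WHERE  d.customer_id = s.customer_id
                   AND    d.current_flag = 'Y');
```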
Confidential
Environment: Linux, Oracle 9i, Oracle 10g, Informatica, Business Objects, Sybase 12.5, TOAD, Oracle Enterprise Manager, PL/SQL, SQL, Unix Shell scripts, Data modelling, Oracle Applications, PeopleSoft, Oracle Expense
Responsibilities
- Data warehouse/business intelligence architect/designer/developer. Requirement analysis, high-level and low-level design. Worked with business users to understand business needs and convert them into technical specifications.
- Database migration from Sybase 12.5 to Oracle 10g
- Database migration from File Maker Pro to Oracle 10g
- Data modelling for data warehouse STAR schema. Various data sources: Oracle Financials, PeopleSoft, TimeCard, Computrition, InfoGenesis, Oracle Expense, Reserve, Student Housing
- Performance tuning, working with business objects developers for design of business objects universes
- Performance tuning of queries and PL/SQL code. Extensively used explain plan, trace and tkprof utilities
- Worked with DBA team for database related performance tuning, database tuning
Confidential
Environment: Linux, Oracle 9i, Oracle 10g, Oracle Warehouse Builder 10g release1 and release2, TOAD, Oracle Enterprise Manager, PL/SQL, SQL, Unix Shell scripts, Data modelling, Talend Open Studio
Responsibilities
- Data warehouse/business intelligence architect/designer/developer. Requirement analysis, high-level and low-level design. Worked with business users to understand business needs and convert them into technical specifications.
- Data modelling STAR schema design for various data warehousing projects
- Source to target mappings to load data from various in house and third party data sources
- Implementation of complex business rules in PL/SQL procedures and functions and implementation of those as user defined transformations in Oracle warehouse builder mappings
- Stage based ETL architecture design for incremental data loading
- Automation of ETL loading parallelism and dependency using Oracle warehouse builder process flows
- Implementation of process flows in Oracle workflow tool
- Design and development of very complex mappings in Oracle warehouse builder
- Performance tuning of queries and PL/SQL code. Extensively used explain plan, trace and tkprof utilities
- Worked with DBA team for database related performance tuning
- Migration of mappings from dev to test to prod
- Worked with QA team during the QA process
- Management of Oracle warehouse builder repositories for development and production
- Scripting in OMB Plus for daily backup of repositories
- Migration of Oracle warehouse builder 10g release1 project to Oracle warehouse builder 10g release2
- Use of table functions as source in OWB mappings
- Implemented Oracle APEX application that allowed the marketing department to enter marketing spend data. This data was later used by the data warehousing process to allocate marketing spend against sales. Reports related to marketing spend by region, date and advertising group.
- Created data integration platform based on the Talend Open Studio tool. The platform allowed affiliates to load large data files on demand and supported extraction of data from the Oracle data warehouse to CSV files that were further used by downstream systems. Data extraction related to sales financial transactions. Data loads into the data warehouse from disconnected data sources such as affiliate data files
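The use of table functions as mapping sources (noted above) relies on Oracle pipelined functions; a minimal hypothetical example that generates a stream of dates:

```sql
-- Types and function are hypothetical; OWB can read the function's
-- output as if it were a table.
CREATE TYPE t_day_row AS OBJECT (day_date DATE);
/
CREATE TYPE t_day_tab AS TABLE OF t_day_row;
/
CREATE OR REPLACE FUNCTION gen_days(p_from DATE, p_to DATE)
  RETURN t_day_tab PIPELINED IS
BEGIN
  FOR i IN 0 .. TRUNC(p_to) - TRUNC(p_from) LOOP
    PIPE ROW (t_day_row(TRUNC(p_from) + i));
  END LOOP;
  RETURN;
END;
/
-- Example use: SELECT * FROM TABLE(gen_days(DATE '2008-01-01', DATE '2008-01-31'));
```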
Confidential
Environment: Sun Solaris, Linux, Oracle 9i, TOAD, Oracle Enterprise Manager, Oracle Management server, PL/SQL, Unix Shell scripts, Oracle APEX
Responsibilities
- Requirement gathering related to SOX compliance of Oracle databases, Oracle Applications and other applications running at Juniper Networks.
- Architected solution to audit actions performed by database users in the production database
- Architected and implemented on-demand access rights to production databases
- Worked directly with the SOX compliance team and SOX auditing team
- Helped in creation and implementation of SOX controls related to the Oracle database, Oracle Applications and other applications.
- Implemented reporting environment to track unauthorized access to production databases
- Helped DBA team in implementation of best practices for database-related tasks as per SOX
- Implemented various roles and assigned these roles to users as per SOX guidelines.
- Implemented best practices for DBA team related to daily tasks, DR testing, access controls, etc.
- Implemented centralized reporting system using Oracle APEX. Auditing reports could be filtered by date and user
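Auditing of database user actions as described above can lean on Oracle's built-in audit facility; a sketch in which the audited table name is hypothetical:

```sql
-- Record the audit trail in the database (takes effect after restart).
ALTER SYSTEM SET audit_trail = db SCOPE = SPFILE;

-- Audit data changes on a sensitive table and all logons.
AUDIT INSERT, UPDATE, DELETE ON finance.gl_journal BY ACCESS;
AUDIT SESSION;

-- Review recorded actions by user and time.
SELECT username, action_name, obj_name, timestamp
FROM   dba_audit_trail
ORDER  BY timestamp DESC;
```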
Confidential
Environment: IBM AIX, Oracle 9i, Oracle Warehouse Builder, Hyperion Essbase , Essbase Analyzer, Brio, TOAD, Oracle Enterprise Manager, Oracle Management server, Oracle streams, PL/SQL, Unix Shell scripts, Data modelling ERWIN
Responsibilities
- Data warehouse/business intelligence architect and technical lead for the ETL team. Requirement analysis, high-level and low-level design. Worked with stakeholders and the project manager to define and manage deliverables. Responsible for planning production rollout and management of the production system. Worked with other team leads to facilitate coordination between BI/ETL and other teams
- Data modelling using Erwin STAR schema for very large scale data warehouse having multiple data sources. Design of dimensions and facts using Ralph Kimball's methodology
- Summarization of data to load Hyperion cubes. Architected the process of monthly summarization to load cubes
- Designed the framework for incremental and on-demand data warehouse refresh on the Oracle 9i and Oracle Warehouse Builder 10g platform
- Created data mapping documents. Architected workflow to run daily and monthly loads
- Installation and configuration of Oracle Warehouse Builder repositories and Oracle work flow
- Management of daily and nightly loads. Shell scripting and cron jobs to execute workflow process flows.
- Scripting in OMB Plus to export Oracle Warehouse Builder repository. Also wrote scripts to import and export specific repository objects
- Design, development and implementation of very complex mappings to load data into slowly changing dimensions and facts. Used table functions extensively to achieve good performance.
- Worked on database parameter settings to increase performance. Performance testing of load mappings using extended trace and TKPROF. Query optimization using various hints
- Replication using Oracle streams
Confidential
Environment: Sun Solaris, Oracle 9i and Oracle tools, Informatica, PL/SQL, FirstLogic, Unix Shell scripts, Data modelling ERWIN , Toad, Oracle Enterprise Manager, Oracle warehouse builder, Oracle applications 11i
Responsibilities
- Design of extracts to load data into customer data warehouse from Oracle applications, Matrix, Clarify and other operational systems
- DNB data integration and load into data warehouse
- Design and development of customer analytics
- Data modelling for data warehouse and data marts using Erwin, logical and physical database design. Also responsible for starting the data modelling practice in the company. Set up standards and best practices for data modelling.
- Wrote extracts in Oracle warehouse builder and Informatica to load data into data warehouse from Oracle applications Orders, Invoice, Customer data, GL data , Clarify, Matrix, Partner exchange, DNB and other data sources
- Data integration among different databases using materialized views and bridging procedures
- Solution architect and member of the global architecture team at HDS.
- Design of core consolidator process to cleanse and de-dupe site and contact data using the FirstLogic tool. Design of automated process to load data from operational systems into the data store, merge customer data through FirstLogic and load the final data into the data warehouse
- Design and development of RFM, RRR and RFF marketing models to get rich marketing intelligence from customer data warehouse
- Application tuning and performance monitoring, Setup and maintain documentation and standards, Planning growth and changes
- Writing scripts for creating tablespaces, rollback segments, tables, synonyms, roles, indexes, views and constraints; database tuning by deciding appropriate table sizing parameters; user maintenance. SQL statement tuning using Explain Plan, TRACE and TKPROF; database optimization; object parameter settings; snapshot creation and management.
Confidential
Environment: HP Unix, Oracle 9i and Oracle Tools, Unix Shell scripts, PL/SQL, Data modelling ERWIN , PERL, Maestro, Batchnet, PL/SQL Cartridge for web development, SQL-Programmer, SQL-Expert, Oracle Enterprise Manager, Oracle warehouse builder
Responsibilities
- Requirement Analysis, High-level design, Low-level design for data warehouse and operational data store
- Data modelling using Erwin, logical and physical database design
- Wrote specifications for back-end development, created technical design document
- Developed Packages, Procedures and functions in PL/SQL, wrote triggers for implementing data integrity
- Wrote extract programs using PERL to extract data for downstream data warehouses, data load from flat files into data warehouse and operational data store
- Data extracts from operational systems using Oracle warehouse builder
- Creation of web site using PL/SQL web cartridge
- Integration of Oracle Applications data
- Data integration among different systems using Oracle XML DB
- Understanding the existing business logic of Compaq applications and merging them into HP applications
- Design and development of operational data store to load data from HP vendors and then transforming the data to HP standards, processing of EDI files, worked on sales and marketing data warehouse. Data load into data warehouse from operational systems. Extracts for downstream data warehouses
- Application tuning and performance monitoring, Setup and maintain documentation and standards, Planning growth and changes
- Writing scripts for creating tablespaces, rollback segments, tables, synonyms, roles, indexes, views and constraints; database tuning by deciding appropriate table sizing parameters; user maintenance.
- SQL statement tuning using Explain Plan, TRACE and TKPROF; database optimization; object parameter settings; snapshot creation and management.
Confidential
Environment: HP Unix, Oracle 8i and Oracle Tools, Unix Shell scripts, PL/SQL, Data modelling ERWIN , Oracle Forms 6i, Oracle Reports 6i, SQL-Programmer, SQL-Expert, Oracle Enterprise Manager, DBA Assistant, Oracle warehouse builder
Responsibilities
- Requirement Analysis, High-level design, Low-level design
- Data modelling using Erwin, logical and physical database design
- Wrote specifications for back-end development, created technical design document
- Developed Packages, Procedures and functions in PL/SQL, wrote triggers for implementing data integrity
- Designed, developed and implemented Sales Tracking and Reporting System in an Oracle8i / HP-UX / Oracle Forms and Reports 6i environment. Apart from its core use as a sales tracking system for the sales and planning team, this system also acts as a data mart for the corporate data warehouse
- Created snapshots for implementing distributed data support, also used materialized views to boost performance
- Application tuning and performance monitoring, Setup and maintain documentation and standards, Planning growth and changes
- Writing scripts for creating tablespaces, rollback segments, tables, synonyms, roles, indexes, views and constraints; database tuning by deciding appropriate table sizing parameters; user maintenance.
- SQL statement tuning using Explain Plan, TRACE and TKPROF; database optimization; object parameter settings; snapshot creation and management.
- Creation of schemas for development, test and production environment.
- Responsible for developer DBA activities
- Creation of users and roles in the production environment. Assigned proper roles to users
- Front-end development of STARS application using Oracle Forms 6i and Reports 6i
- Working with data warehouse team for setting up materialized views and writing data feed procedures, setting up CRON jobs, Shell scripting in KORN shell
- Data load from third party flat files and operational systems using Oracle warehouse builder
Confidential
Environment: Sun Solaris, Oracle 8i and Oracle Tools, Unix Shell scripts, PL/SQL, Data modelling ERWIN , Oracle Designer, Oracle Forms 6i, Oracle Reports 6i, WEB-DB, SQL-Programmer, SQL-Expert, Oracle Warehouse Builder, MicroStrategy, Oracle Enterprise Manager, DBA Assistant, Oracle Discoverer, Oracle Applications
Responsibilities
- Project Management, Requirement Analysis, High-level design, Low-level design
- Data modelling using Erwin, logical and physical database design
- Wrote specifications for back-end development, created technical design document
- Developed Packages, Procedures and functions to be called from Java front-end to support online web-store
- Designed and implemented a full lifecycle production, development and test environment for Vendor Integration, database setup for Web Store in Oracle8i/Sun Solaris environment.
- Database design (STAR schema) for a data mart application for clickstream analysis and web site visitor behavior tracking. Used Erwin for STAR schema design, Oracle Warehouse Builder to create load scripts and MicroStrategy for reporting
- Extracted order, invoice and customer data from Oracle Financials and integrated it with the data warehouse
- Working with DBA team
- Streamlined operations and optimised database performance
- Installation, configuration and upgrading of Oracle server software and related products
- Taking care of database design and implementation; implementing database security
- Creating and maintaining users and roles, assigning privileges, etc.; performing database tuning and performance monitoring
- Performing application tuning and performance monitoring
- Setup and maintain documentation and standards, Planning growth and changes
- Sizing tablespaces, indexes, rollback segments and tables etc.
- Writing scripts for creating tablespaces, rollback segments, tables, synonyms, roles, indexes, views and constraints; database tuning by deciding appropriate table sizing parameters; user maintenance.
- SQL statement tuning using Explain Plan, TRACE and TKPROF; database optimization; object parameter settings; snapshot creation and management.
- Creation of schemas for development, test and production environments.
- Export of schemas to create objects in the production environment.
- Creation of users and roles in the production environment. Assigned proper roles to users
- Created web-based reporting system using Oracle WEB-DB; also created Oracle Forms and Oracle Reports in Developer 6i
- Shell scripting in KORN shell
