- Extensive experience in projects on Data Mining & Data Warehousing.
- Maintained database performance by identifying and resolving production and application development problems; calculating optimum parameter values; and evaluating, integrating, and installing new releases.
- Involved in various projects related to System/Data Analysis, Design and Development for Data warehousing environments.
- Comprehensive knowledge and experience in process improvement, normalization/de-normalization, data extraction, data cleansing, and data manipulation.
- Established and maintained comprehensive data model documentation including detailed descriptions of business entities, attributes, and data relationships.
- Developed mapping spreadsheets for the ETL team with source-to-target data mappings, including physical naming standards, data types, volumetrics, domain definitions, and corporate metadata definitions.
- Exceptional communication and presentation skills and established track record of client interactions.
Languages & Technologies: Hadoop stack (Hive, Pig, Oozie, ZooKeeper), Impala, Ab Initio GDE (3.2.1)/Metadata Hub/Operational Console, Python, Unix Shell
Databases: MySQL, Oracle 9i/10g, Teradata 12, DB2
Miscellaneous skills: ER Studio, Microsoft SharePoint Server 2003, HTML, Microsoft SQL Server 2008
Confidential, Foster City, CA
System/Data Analyst
- Design and develop optimal data models and applications to expose data to consumers efficiently
- Design, develop and own reports and dashboards
- Identify, analyze, and interpret trends and patterns in complex data sets; categorize users based on spend behavior.
- Perform data mining for third-party clients to increase their market share.
- Design, develop, document and implement new programs and subprograms, as well as enhancements, modifications, and corrections to existing applications.
- Create documentation and procedures for installation and maintenance.
- Build and maintain relationship with global and virtual teams and third parties on software development or support issues.
- Work closely with project team members (Analysts, Developers, Project Managers, Product Managers, etc.) and coordinate with other groups (e.g., Network, QA, Operations, Engineering).
- Identify opportunities for further enhancements and refinements to standards, best practices and development methodologies.
- Support and deliver on initiatives resulting from the Business Plan and Technology strategy by leveraging reuse, eliminating duplicative components, and applying sound architectural design and innovative products.
- Work to ensure technology decisions are "business" driven.
- Suggest and cultivate ideas on the extension of the current programs to new products and services.
Technology: Ab Initio, DB2
- Impact analysis of all the currently running Confidential Extras processes to be decommissioned.
- Coordinating with the upstream and downstream processes to handle their relevant processes efficiently without causing any stoppage in other business processes.
- Create documentation and procedures pertaining to the decommissioning.
- Work closely with other members of the global scheduling team (GSCHED) and global production support team (GPRODSUPP) to ensure smooth decommissioning.
Technology: Mainframes, Unix, Ab Initio Metadata Hub
Confidential, Foster City, CA
System/Data Analyst
Technology: Ab Initio GDE 3.17+, Ab Initio Metadata Hub, Op-Console, Unix Shell scripting
- As part of the Enterprise Metadata Management team within Confidential, involved in various projects dealing with enrichment and extraction of metadata from existing processes, thereby facilitating business users in the decision-making process.
- Developed a software monitoring component to manage nightly software builds, alerts and escalations to the respective owners and their SMEs.
- Automated multiple data flow processes to optimize the work load and performance.
- Developed dashboards depicting enterprise-wide metadata quality for business users, thereby helping them in decision making.
- Improved existing workflow of metadata extraction, refresh and simplified the data integration processes.
- Developed various Ab Initio graphs and Unix shell scripts, thereby optimizing the existing processes.
- Created multiple daily, weekly, and ad hoc jobs to facilitate existing processes using Operational Console from the Ab Initio package.
- Actively contributed view customizations and updates to the Metadata Hub front end based on business needs and feedback.
- Created a Microsoft SharePoint site from scratch to provide a single repository for metadata documents, user guides, and documentation on existing metadata processes.
- Created an exception-handling system for managing whitelists using the Nintex workflow feature of Microsoft SharePoint Server.
- Development of dashboards, user interface improvements and automated monitoring
- Creation of the data set/DBC/Physical table best practices to be included in the CMLS best practice guide.
Technology: Ab Initio, Teradata, Unix, ER Studio.
- Integrated patient clinical data to provide business users and systems with consistent, accurate, and timely core business data for Large Retail Pharmacy decision support systems.
- Developed Data Mapping, Data Governance, Transformation and Cleansing rules for the Master Data Management Architecture involving OLAP.
- Worked on snowflaking the dimensions to remove redundancy.
- Analyzed and converted legacy code (COBOL and PL/SQL) into Ab Initio graphs and artifacts to optimize the performance of existing processes.
- Designed Ab Initio ETLs to integrate various sources into EDW.
- Created Ab Initio graph templates to be used by the entire team across different modules.
- Implemented concepts such as change data capture and data enrichment in this project.
- Involved in exhaustive documentation for technical phase of the project and training materials for all data management functions.
Technology: Oracle, Ab Initio, Unix.
- Performance improvement and optimization of database and already existing graphs.
- Generalized existing graphs that had been built for specific source systems and applied the same logic to 21 additional source systems using the generic approach.
- Developed the required data warehouse model using Star schema for the generalized model.
- Created end-to-end mappings for the entities, attributes, and relationships of newly introduced source systems.
Technology: Teradata, Ab Initio, Unix.
- Analyzing customer requirements.
- Providing cleansed data files to downstream systems based on requirements.
- Updating the relational database based on user requirements and improving its performance.
- Determining database structural requirements by analyzing client operations and applications.
- Providing solutions for the request and creating low level designs for the solutions.
- Doing impact analysis for the request to find out the systems affected by the change.
- Validating source data, analyzing it, and removing any discrepancies.
- Creating metadata for Ab Initio graphs based on the requirements.
- Testing and bug-fixing processes.
- Extracted data from databases (Oracle, SQL Server, DB2, flat files) using Ab Initio to load it into a single data warehouse repository.
Skills: UNIX Shell Scripting, PL/SQL, ER Studio V8.0.1, Oracle 9i/10g, Mainframe.
- Worked on multiple projects with different business units, developing processes for NAB partners to identify customers, categorize them, send the data to a third party (Millennium), and process files coming back from Millennium as per the requirements.
- Developed normalized Logical and Physical database models to design OLAP system for banking & insurance applications using ER Studio.
- Handled client requirements throughout the SDLC phases up to hypercare (post-implementation), including thorough end-to-end testing of the requests I worked on.
- Preparing Low level design documents and coding checklists for different processes.
- Performance tuning and review of previously created SQL queries.
- Handled additional responsibility as the CM (Configuration Management) anchor.