BI / DB Architect / Lead Consultant Resume
California
SUMMARY
- Twenty-plus years of Business Intelligence IT experience, including Management, Consultant, and Architect roles.
- Experience in all phases of the SDLC, from bits and bytes (Z80) through Big Data, including web-based technologies.
- Known for taking on complex problems and turning them into intuitive, elegant solutions.
- Regarded as a performance-tuning expert, solving problems from experience rather than relying only on established patterns.
- Extensive experience in translating business needs into Data Warehouse, BI, and Data Architecture solutions.
- Technical proficiency with hands-on, day-to-day experience in design, implementation, and coding across the full life cycle of Data Warehouse, Business Intelligence, OLTP, and OLAP environments.
- Strong knowledge of Big Data technologies: Hadoop, MongoDB, Vertica, Spark, NoSQL, schema-on-read / schema-less, and column-oriented databases.
- Extensive experience in selecting and pushing transformations to the best server (DB / ELT / ETL) available in the environment, especially for large-volume transformations such as Lookup, Aggregator, Sorter, and Filter required in BI analytics.
- International experience across business application SDLCs in Manufacturing, Consulting, Computer, Finance, Educational, Semiconductor (Supply Chain), and Automotive (sales and rental) organizations.
- Known for bringing development teams together into efficient, well-coordinated units to meet the full breadth of user needs.
- Extensive experience in ETL/ELT tools: data mapping and performance tuning of sources, targets, mappings/transformations, and jobs/sessions. Strong knowledge of caching technologies, which helps in selecting the best mapping/transformation for each process.
- Excellent communication, client interaction, onsite-offshore coordination, and problem-solving skills.
- Experience in installation and configuration of Informatica, Talend, and Hadoop / Vertica (multi-node clusters).
TECHNICAL SKILLS
Databases: Oracle / Teradata / MongoDB / MS SQL Server / MySQL / Vertica / Couchbase / Cassandra
Languages: Python / Perl / Node.js / Shell Scripting
Platforms: UNIX / Windows / Linux
ETL Tools: Informatica Power Center / Talend
Others: OLAP / Materialized Views / Summary Management / Big Data / Spark
PROFESSIONAL EXPERIENCE
Confidential, California
BI / DB Architect / Lead Consultant
Responsibilities:
- Developed an Incorta utility to export the tenant XML and extract its attributes for dependency search and memory management.
- Developed workaround PySpark code for incremental loads of Incorta Materialized Views.
- Tuned an MDM SQL query, reducing its run time from around 2 hours to 6 minutes.
- Within a week of joining the Employee Systems Group, detected a long-running PySpark function used in around 20 Incorta MV jobs and suggested a performance improvement for it.
- Developed PySpark MV jobs used by the Incorta dashboard for the Apple Employee System.
- Developed complex Vertica SQL views with Analytics to support front-end Microstrategy Dashboards/Graphs.
- Built ETL of CentralStation ticketing-system logs and construction performance data into Vertica through Perl/Python.
- Provided a MongoDB POC for migrating the Data Center Asset Management system off its SQL Server database.
- Performance-tuning expert on the Global CRM (GCRM) Sales team, responsible for DB solutions, performance tuning, and interfaces in/out of GCRM.
- Received client appreciation for providing an effective data security solution through Oracle Text Search (Catalog / Context) for frequent territory changes in the Note Security application.
- Recognized for suggesting Index-Organized Table solutions for a couple of Sugar CRM data security tables.
- Relied upon by multiple teams for expertise on complex SQL problems.
- Rewrote long-running SQL queries, drastically improving performance.
- Earned business-user appreciation for significantly reducing front-end UI wait time through a minor back-end materialized-view query-rewrite enable option, at essentially zero project cost.
- Provided POCs for data migration through Talend from the existing GCRM to Sugar CRM, for large-volume data loads into Vertica, and for migrating the existing SEA and Prices systems to new database environments.
- Improved query performance from multiple seconds to milliseconds for a Vertica project by creating pre-join projections on fact and dimension tables.
- Currently working on Incorta schemas/dashboards for the MDM, CRM, and Nova systems.
- Worked in PySpark to implement the transformations required for Incorta to handle the Employee System dashboards.
- Developed a utility to convert Excel files to flat files, used across all Vertica ETL projects.
- Developed multiple ETL scripts for Vertica ETL projects in Perl and Python.
- Delivered a MongoDB POC for Power Strip Automation, which required a secured, encrypted, high-availability setup.
- Worked on Perl ETL scripts to migrate data from a SQL Server database to a MongoDB hierarchical data model.
- Worked on the initial data migration through Talend, deploying the jobs on an Apache server; successfully implemented a multi-process workflow through Perl scripts using fork.
- Developed incremental data-load jobs through Talend REST components to utilize the Sugar CRM APIs, which incorporate elastic search.
- Worked as project lead and onsite coordinator for WISR (Worldwide iPhone Sales Reporting) on major quarterly core and monthly enhancement releases to streamline Business Objects reporting, the iPhone Sales Portal dashboard, and Roam BI applications.
- Consolidated, standardized, and modularized Teradata procedures.
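The incremental-load workaround for materialized views mentioned above can be illustrated with a minimal sketch. This is a hypothetical, pure-Python stand-in for the PySpark logic; the function, key, and timestamp names are assumptions, not the actual Incorta code:

```python
from datetime import datetime

def incremental_load(target_rows, source_rows, key="id", ts="updated_at"):
    """Merge only source rows newer than the target's high-water mark.

    A full reload rewrites every row; an incremental load keeps the
    previously materialized rows and upserts just the changed ones.
    """
    # High-water mark: the latest timestamp already materialized.
    watermark = max((r[ts] for r in target_rows), default=datetime.min)
    merged = {r[key]: r for r in target_rows}
    for row in source_rows:
        if row[ts] > watermark:    # only rows newer than the watermark
            merged[row[key]] = row  # upsert by key
    return list(merged.values())
```

In PySpark the same idea maps to filtering the source DataFrame on the watermark and merging by key; the sketch only illustrates the pattern.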
Environment: Incorta, Spark, Vertica, MongoDB, Teradata, Oracle, OLTP, OLAP, ETL, Informatica, Talend, Python and Perl.
Confidential, California
ETL/Data Warehouse Team Lead
Responsibilities:
- Increased the production support customer ticket closure rate from 68% to 98% within a six-month period by overseeing day-to-day operations of the data warehouse and ensuring that customer issues/tickets were resolved in a timely manner. Received the Star of the Month award in April 2010.
- Improved customer relationships by building rapport, delivering on-time, quality work, and putting long-term fixes in place to bring stability to the data warehouse environment.
- Resolved data quality issues from legacy source systems by analyzing/providing error log trends.
- Designed and developed a dashboard system, using SSIS and SSRS, to show measures such as time and mileage amounts, counts, and drill-down data for daily trends in rental/reservation flow from legacy systems / SaaS to downstream applications through the Data Warehouse and Data Marts.
- Trained the team on the new Enterprise Data Warehouse developed under SQL Server Business Intelligence.
- Worked as lead/member in resolving long-pending customer issues from Data Management and downstream user issues.
- Coordinated with offshore development team (EDS/MphasiS, HP subsidiary companies).
- Coordinated with other departments using downstream applications of the data warehouse / data marts to improve reporting quality.
- Improved team productivity by providing technical expertise and guidance in the skills required for production support.
- Replaced email-only communication of issues with a requirement document/approval process for each ticket, ensuring that each issue was resolved in the right direction.
Environment: Oracle, SQL Server, OLTP, OLAP, Informatica, SSIS, SSRS, SSAS and Perl.
Confidential, California
Solutions Architect
Responsibilities:
- Reduced pager support by 90% in a six-month period by participating in developing the Dallas Semiconductor ETL to i2 supply chain software and merging it into MAXIM’s job stream.
- Increased sales by providing quality-oriented results for supply/demand change requests to Supply Chain and Factory Planner software in a time-driven environment.
- Increased Job Monitor workflow efficiency by identifying and developing parallel processing and event-based triggers, resulting in timely delivery of pegging reports across factories around the globe.
- Worked as a data team member in upgrading the Dallas Semiconductor ETL to i2 supply chain software.
- Provided ongoing software maintenance for the Supply Chain Planner and Factory Planner software.
- Worked on Business Objects universe development and maintenance.
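The parallel-processing idea behind the Job Monitor improvement can be sketched as follows. This is a hypothetical Python illustration, not the actual job stream; the per-factory job and factory names are placeholders:

```python
from concurrent.futures import ThreadPoolExecutor

def run_report(factory):
    # Placeholder for an independent per-factory pegging-report job.
    return f"pegging report for {factory}"

def run_all(factories, workers=4):
    # Independent jobs with no shared state can run concurrently, so
    # total wall time approaches the slowest job instead of the sum.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(run_report, factories))
```

Serializing such jobs wastes the time each one spends waiting on I/O; running them in a worker pool is what makes globally distributed reports land on time.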
Environment: Oracle, OLTP, OLAP, ETL, Informatica and Business Objects, Erwin.
Confidential, California
Applications Architect
Responsibilities:
- Eliminated the manual FTP and upload process by implementing an automated Customer Hierarchy Replication process between different databases.
- Increased Online (Internet) Assessment System sales by producing student results within weeks, managing an offshore team developing a project that interfaced different manual systems, databases, and the mainframe.
- Provided development direction and coding support to staff developing the Scoring Analytical Reporting database, data mart, and aggregation/extraction processing, introducing materialized views and OLAP, which resulted in statewide reports for K-12 paper assessment tests within weeks.
- Designed and developed a template-based Business Intelligence solution to create user-defined fields for multiple customers, eliminating months of custom programming for each customer.
- Analyzed, designed, and developed ETL for the Customer Hierarchy Replication system between the Monarch (Oracle), AMS/OE (Blue Martini), and Compass (Oracle) systems used by different McGraw-Hill divisions. The FTP between Blue Martini and Oracle was coded in Perl, using sftp to log in remotely and initiate the Blue Martini process.
- Involved in ETL Development with Wipro Consultants for Online Assessment System to interface Oracle database and IBM mainframe CA Datacom.
- Assisted software requirement analysts in developing UAT cases for the above Informatica projects.
- Participated in the design of the Scoring Analytical Reporting DB, data mart, and aggregation/extraction processing using materialized views.
- Delivered Scoring Analytical Reporting for the Kentucky and Connecticut programs: modified/customized data mart loading packages and was involved in developing ISIS report extract packages. Used Perl scripts to break huge files down by division for ISIS reporting, which had a file-size limitation.
- Introduced materialized views and OLAP functions in the aggregation/extraction processing for Connecticut.
- Provided direction and guidance for BI/ETL development and support of the student test assessment Data Marts.
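The file-splitting step above (breaking huge extract files down so each piece stays under a downstream file-size limit) was done in Perl; a minimal Python sketch of the same technique, with an assumed line-oriented record format, might look like:

```python
def split_by_size(lines, max_bytes):
    """Group lines into chunks whose encoded size stays under max_bytes.

    Records are never split across chunks, mirroring the constraint that
    each output file must be independently loadable by the reporting tool.
    """
    chunks, current, size = [], [], 0
    for line in lines:
        n = len(line.encode("utf-8"))
        if current and size + n > max_bytes:
            chunks.append(current)   # flush the full chunk
            current, size = [], 0
        current.append(line)
        size += n
    if current:
        chunks.append(current)
    return chunks
```

Each chunk would then be written to its own file; the real utility also split on the division key, which this sketch omits.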