Diversified Business Intelligence Engineer with an interest in data, distributed systems and machine learning; 3+ years of experience in the US and the remainder in India. Exceeded customer expectations through focused research; quantitative, statistical, technical, analytical and problem-solving skills; and a drive to learn and implement processes that improve data-driven decision making. Designed, built and deployed efficient, scalable, robust solutions to intricate business problems for Confidential (Global Business Intelligence) using data warehousing technologies and reporting tools.
- Data Analysis and Integration: Adept at analysis and automated validation of millions of records using SQL and Python. Able to translate complex business needs into technical specifications.
- Communication Skills: Interacting with management and users to gather requirements; estimating effort and defining project scope in a fast-paced agile/scrum environment.
- Detail-Oriented: Conducting root-cause analysis of user tickets and implementing long-term solutions to performance and scalability issues, leading to product stability.
- Team Management: Working with a 5-7 member development team to design and build BI solutions. Effectively communicating project plans and organizing triage calls to track the backlog.
- Domain Knowledge: Product Launch, Supply Demand Planning, Demantra Forecasting, Sales, Finance and Operations, Master Data Management, Retail, Reseller and Education
Business Intelligence Tools: SAP Business Objects 4.1 (Designer and Web Intelligence), Tableau 10
Data Modeling: Dimensional modeling (Star, Snowflake, SCD Type 1,2) and Relational modeling
Databases: Teradata 14.10, Oracle 12c/11g, Vertica, MS Access 2003
Programming: C++, Python, LINUX Shell Scripting
Development Tools: SQL Developer, Mac Terminal
Operating System: MAC OS, Windows, UNIX/LINUX
ETL: Informatica, Storm, Teradata Utilities (BTEQ, TPump, TPT, FastLoad, FastExport)
Productivity: Microsoft Office Suite, Git, SVN, Autosys Scheduler, Waterfall/Agile Methodology
Project Lead and Solution Consultant
- The Confidential tool enables planners to set up products for supply checks, auto-extend quotes if prevailing quotes cannot be supported, and schedule quote changes to take effect at a future date/time.
- The enterprise data warehouse (EDW) in Teradata interfaces with SAP GUI to determine quote changes.
- A 1.5-hour lag between the EDW Open Orders/Inventory snapshot and AQE sending its recommendations resulted in multiple excess orders during demand spikes.
- Identified alternate sources where Open Orders and Inventory reach the EDW with a minimum latency of 5 minutes.
- Also, compared different technology platforms like Vertica for feasible solutions.
- Engaged Infrastructure and Operations teams and management to incorporate changes in the ETL framework to capture real-time snapshots.
- Parallelized execution of independent modules and maintained data consistency in presentation layer by designing new metadata logic.
- Analyzed round-trip times for 2 months and quantified the results using Tableau and SQL query results.
- This improved data freshness and quote accuracy in auto-generated Business Objects reports; eliminated manual quote monitoring and adjustment; and improved customer conversion and reduced excess costs by aligning quote strategy with build strategy.
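The parallelization of independent modules mentioned above can be sketched as follows. This is a minimal, hypothetical illustration (module names and the `refresh` stub are invented, not from the actual ETL framework) of running non-dependent refresh steps concurrently instead of serially:

```python
# Hypothetical sketch of parallelizing independent refresh modules with a
# thread pool; module names and the refresh stub are invented placeholders
# for real steps such as BTEQ scripts or TPT loads.
from concurrent.futures import ThreadPoolExecutor

def refresh(module):
    # Stand-in for a real module refresh; would run the actual load here.
    return f"{module}: ok"

MODULES = ["open_orders", "inventory", "quotes"]  # no cross-dependencies

with ThreadPoolExecutor(max_workers=3) as pool:
    # map() preserves input order, so downstream consistency checks can
    # rely on a deterministic result sequence.
    results = list(pool.map(refresh, MODULES))
print(results)
```

Since `pool.map` preserves input order, the presentation layer can consume results deterministically even though execution overlaps.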
Technical Solution Consultant
- The Automated Regression Testing tool is based on continuous integration: every time a developer commits code, it triggers a set of pre-built test cases to validate the changes and merge small changes during integration testing.
- Python, Selenium and Protractor were used to automate Quality Assurance testing.
- Organized meetings with the developers of the tool and enhanced features like automated test data setup that caters to all functionalities of an application.
- Quantified effort required and analyzed objects in scope and identified test data requirements extending to certain rare scenarios.
- Baselined test data bed for future builds and communicated the purpose of the tool to all developers.
- One-time setup of the process eliminated merge conflicts and duplicated efforts.
- Automated tests run for every build ensured consistent quality and readiness for user testing.
- Availability of documented test data requirements for each of the regression test cases and re-usability of framework for future with low maintenance encouraged adaptability by various business groups.
- Reduced manual efforts by executing all processes end-to-end.
Project Lead and Solution Consultant
- The Confidential project involved a Teradata 15.10 upgrade and large-scale environment and capacity expansions into additional data centers, spanning multiple functional groups and 50+ application teams.
- Almost every transactional system feeds data into this EDW, which is then integrated across different dimensions for consumption by the reporting and analytics business community. Teradata provides this MPP solution, consisting of systems, storage, network interconnect, OS and database software all packaged together.
- Created a detailed plan (estimated effort, gathered points of contact, gave heads-up to application teams) and communicated effectively with Teradata Consultants through prompt actions and timely updates.
- Created an auto-validation tool in Python to validate close to 10k tables, views, procedures, macros and access privileges.
- Distributed the workload of building new ETLs; created new Autosys jobs and verified data flow from source in both the old and new environments.
- Used Microsoft Excel VLOOKUP and other features to quantify progress, shared with the client daily.
- The new Teradata system was operational in 2 weeks with minimal business impact and high performance and scalability.
- Made application teams aware of database objects no longer used, freeing perm space and enabling decommissioning of unused applications.
- Delivered fully-functional new system with enhanced performance under tight time constraints.
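The auto-validation described above can be sketched in miniature. This is an illustrative outline only, under the assumption that object inventories (name plus row count) have already been pulled from each environment's data dictionary; the real tool would query the Teradata catalog through a database driver, and all names below are invented:

```python
# Hypothetical sketch of cross-environment validation after a migration.
# The real tool would pull these inventories from the Teradata data
# dictionary; here they are plain dicts of {object_name: row_count}.

def compare_inventories(old_env, new_env):
    """Return objects missing from the new environment, plus objects
    present in both environments whose row counts disagree."""
    missing = sorted(set(old_env) - set(new_env))
    mismatched = sorted(
        name for name in set(old_env) & set(new_env)
        if old_env[name] != new_env[name]
    )
    return missing, mismatched

old = {"sales.orders": 1_200_453, "sales.quotes": 88_910, "fin.ledger": 402}
new = {"sales.orders": 1_200_453, "sales.quotes": 88_875}

missing, mismatched = compare_inventories(old, new)
print(missing)      # objects not migrated at all
print(mismatched)   # objects migrated with differing row counts
```

Extending the same comparison to views, procedures, macros and access privileges is a matter of building one inventory dict per object type.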
Project Lead and Solution Consultant
- Quickly grasped core functionalities of the most critical application in GBI space when project was in Integration testing phase.
- Created test suites and dove deep into key issues in the UI and database. Collaborated with 2 developers, a Quality Assurance Engineer and Business Analysts to clear the product backlog and improve usability. Learned new technologies such as message queueing, Oracle PL/SQL, UNIX shell scripting using nohup to run background processes, and asynchronous data flow between systems, and created Tableau reports to monitor system health.
- Leveraged order generator to test quote push thresholds and gained knowledge about projections using linear regression model to predict time to hit threshold.
- Automated quote push for parent part number if any of constituent parts were unavailable by checking order velocity.
- The previous system required a lot of number crunching in spreadsheets, dependent on data exports that were not real-time.
- Customers were told clearly when they would receive their Confidential product. Overall timing was reduced to a few milliseconds using extensive parallelism, code optimization techniques, and rigorous unit, integration and stress testing.
- As part of the Informatica decommission project, all inbound interfaces to the EDW sourced from Oracle were converted to Storm.
- Apache Storm is a free, open-source real-time computation system designed to reliably process unbounded streams of data.
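The threshold projection described above can be illustrated with a small sketch. This is not the production model, just a minimal ordinary-least-squares fit extrapolated to estimate when cumulative order volume crosses a quote-push threshold; all numbers and names are invented:

```python
# Illustrative sketch (not the production code) of predicting time to hit
# a quote-push threshold with a simple linear regression over order volume.
# xs are minutes since a snapshot, ys are cumulative orders.

def fit_line(xs, ys):
    """Ordinary least squares for y = slope * x + intercept."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
             / sum((x - mean_x) ** 2 for x in xs))
    return slope, mean_y - slope * mean_x

def minutes_to_threshold(xs, ys, threshold):
    """Extrapolate the fitted line to when volume reaches the threshold."""
    slope, intercept = fit_line(xs, ys)
    if slope <= 0:
        return None  # volume flat or falling: threshold never reached
    return (threshold - intercept) / slope

# Example: 10 orders every 5 minutes, threshold of 100 orders.
eta = minutes_to_threshold([0, 5, 10, 15], [0, 10, 20, 30], 100)
print(round(eta))  # 50
```

A production version would refit as new order snapshots arrive, so the projection tracks demand spikes rather than a single historical slope.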
- Classified 300+ interfaces based on complexity, source to target mapping and refresh type.
- Automated migration by doing metadata setup in test environment, back-migration of original Informatica workflows from production, loading and validating data between Informatica and Storm. Java automation program parsed Informatica XML file and generated file with Storm parameters and required metadata setup.
- Collaboration and sharing of best practices within the team helped complete the project ahead of schedule.
- Directly interacted with Product Managers and Technical Support teams and communicated implementation and backout plans.
- The new ETL tool expedited the process to extract data from source in real time to target systems and BI reporting tools like Tableau/ SAP Business Objects as well as reduced the cost of data processing.
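The XML-parsing step of the migration automation described above (which the source implemented in Java) can be sketched in Python. The element and attribute names below follow a deliberately abbreviated, hypothetical export format, not the full Informatica schema:

```python
# Simplified, hypothetical sketch of the migration helper: parse an
# abbreviated Informatica workflow export and emit the key/value metadata
# the Storm setup would be seeded with. Element names are illustrative.
import xml.etree.ElementTree as ET

WORKFLOW_XML = """
<POWERMART>
  <SOURCE NAME="ORDERS_SRC" DBDNAME="ORCL_PROD"/>
  <TARGET NAME="EDW_ORDERS"/>
  <SESSION NAME="s_m_load_orders" REUSABLE="NO"/>
</POWERMART>
"""

def extract_storm_params(xml_text):
    """Map the Informatica elements we care about onto Storm setup params."""
    root = ET.fromstring(xml_text)
    return {
        "source_table": root.find("SOURCE").get("NAME"),
        "source_db": root.find("SOURCE").get("DBDNAME"),
        "target_table": root.find("TARGET").get("NAME"),
        "session": root.find("SESSION").get("NAME"),
    }

print(extract_storm_params(WORKFLOW_XML))
```

Running the same extraction over 300+ exported workflows is what makes the complexity classification and metadata setup repeatable rather than manual.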
Project Lead and Solution Consultant
- Created a unified data platform in Teradata, as a central location of maintenance where a single change is automatically reflected in many disparate SAP Business Objects universes and Essbase cubes.
- Organized recurring meetings with data publishers and subscribing teams to formulate file type, format, data type, source-to-target mapping and extensions.
- Ensured security of custom groups created on secret products. Incorporated 8 new dimensions based on which custom groups can be created in MDM UI.
- Revised design and data models as per business needs.
- Granted privileges to downstream applications.
- Delivered a robust new platform with significantly enhanced features and easy maintenance across cross-functional business groups, providing consistent data across multiple applications so analysts spend more time analyzing than gathering data.
Senior Technology Analyst
- Introduced new UB measures and associated dimensions in Finance reporting tools.
- Engaged team to brainstorm about reducing latency in publishing UB measures to subscribers.
- Developed, tested and deployed revised code using Teradata analytical functions such as RANK and ROW_NUMBER; refresh frequency improved from every 2 hours to every 30 minutes.
- Later created a separate module to report pure UB measures in Product Launch Dashboard by building data pipelines to be consumed by SAP HANA using Confidential proprietary tools. Prioritized data integrity and performance. Scheduled reports directly sent to senior executives daily to display actual sales.
- Conducted playback sessions and resolved issues/ conflicts. Designed compatibility of measures with existing dimensions used in 20+ applications.
- End user demand was accurately calculated for countries with a high number of non-managed locations.
- The Unbrickings (UB) measure is used widely by the worldwide Demand Forecasting and Analytics team to monitor demand and performance; it is more reliable because UB is the single source of truth.
- Designed and developed new aggregates in Enterprise data warehouse (Teradata) and created new key metrics and dimensions in Business Objects universes.
- Developed a forecast system with a range of twelve quarters that reports historical actual sales data as well as forecast data for upcoming weeks. Built a reusable metadata framework, easy to maintain and flexible, to refresh certain data segments on a custom frequency.
- Toward the end of each week, the five-stage forecasting model finalized the forecast for that week, driving key business decisions reported to executives of various business groups.
- Exhaustive testing of all scenarios, automated data validations and SLA check using UNIX Shell scripting, and user training after every release extended its usage to different Finance areas and stakeholders.
- With its ability to schedule and distribute reports, GRID (Business Objects) on Forecast metrics meets the need for operational reporting in Confidential Finance.
- Detailed actuals reporting on Bookings, Billings, Backlog (BBB) as well as revenue and Confidential retail store measures on the new hierarchies managed by business users.
- Enables reporting on Sell-through units & dollarization, reseller inventory, unbrickings and total sales forecast from Demantra.
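The windowing pattern behind the RANK/ROW_NUMBER work above can be sketched portably. The example uses SQLite (3.25+ supports window functions) purely so it runs anywhere; the table, column names and data are invented, and on Teradata the equivalent would be a QUALIFY clause:

```python
# Teradata-style windowing sketched with SQLite so it is runnable anywhere:
# ROW_NUMBER() picks the latest UB measure per product, the kind of
# latest-row selection the refresh jobs above rely on. Schema is invented.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE ub_measures (product TEXT, week INTEGER, units INTEGER);
INSERT INTO ub_measures VALUES
  ('A', 1, 100), ('A', 2, 140), ('B', 1, 60), ('B', 2, 55);
""")

latest = conn.execute("""
SELECT product, units FROM (
  SELECT product, units,
         ROW_NUMBER() OVER (PARTITION BY product ORDER BY week DESC) AS rn
  FROM ub_measures
) WHERE rn = 1
ORDER BY product
""").fetchall()
print(latest)  # [('A', 140), ('B', 55)]
```

RANK would behave the same here but can return ties; ROW_NUMBER guarantees exactly one row per partition, which is what a "latest snapshot" refresh needs.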
Senior System Engineer
- Wrote tuned queries in Teradata and developed ETL interfaces using Informatica to map data from source to the reporting layer.
- Analyzed root causes of report performance degradation; suggested and implemented workarounds for critical reports, helping business users make better decisions.
- Performed relevant Software Development Life Cycle (SDLC) activities in the development, maintenance and reengineering of existing Business Objects reports. Conducted unit tests and documented results; responded to production issues and implemented solutions.
- Validated Business Objects reports by writing relevant queries and checking against available data (functional, integration, system and user-experience testing).
- Reduced cost through optimal solutions using Teradata best practices; improved response time of 30 BO reports by 40%.
- Provided accurate, faster access to data for timely decision making. Created a knowledge base documenting lessons learned for future user training.