- Experience with BI tools including Power BI Desktop, Power BI Service, Tableau Desktop, Tableau Server, Cognos, Cognos TM1, and TIBCO Spotfire.
- An IT professional with 11+ years of experience in system analysis, estimation, planning, coding, testing, and post-production support across data warehousing, data modeling, business intelligence, decision support systems, OLTP/OLAP (star and snowflake schemas), report generation, and web development.
- Expert in designing and developing visually rich Power BI dashboards; used Power BI Gateways to keep dashboards and reports up to date.
- Developed Tableau visualizations and dashboards in Tableau Desktop, producing compelling visuals that convey the story within the data.
- Published workbooks with user filters so that only the appropriate teams can view them.
- Experienced in designing and developing data warehousing reports using Cognos (Report Studio, Query Studio, Analysis Studio, Event Studio, Active Reports, Workspace, Workspace Advanced, and OLAP cubes).
- Proficient in developing business models using Framework Manager (relational and DMR) and publishing packages to Cognos Connection per business requirements.
- Experienced in Framework Manager metadata modeling (database layer, logical layer, presentation layer, CQM and DQM packages, and relational and DMR modeling).
- Involved in migrating Tableau dashboards and Cognos packages/reports from development to stage and stage to production; administered users, groups, and roles; scheduled jobs; created events and folders; and assigned permissions.
- Involved in Cognos upgrade from Cognos 8.2 to Cognos 8.4 to Cognos 10.x.
- Created data source connections, configured security, and deployed (export/import) content from one environment to another.
- Implemented Security at different levels in Framework Manager and Cognos Connection to restrict access.
- Diverse experience in Tableau dashboard and Cognos report testing, functional testing, integration/system testing, and quality assurance.
- Experienced in migrating Business Objects reports to Cognos and Cognos reports to Spotfire.
- Experienced in creating complex Spotfire dashboards and reports using TIBCO Spotfire Professional.
- Created complex Information Links to pull data from Oracle 10g/11g and Denodo.
- Experienced in financial planning solutions using Cognos TM1 10.2 (Architect, Contributor, Perspectives, and TurboIntegrator), from requirements through design, testing, implementation, and production support.
- Strong experience in extracting, transforming, and loading (ETL) data from various sources into data warehouses and data marts using Informatica PowerCenter (Repository Manager, Designer, Workflow Manager, Workflow Monitor, Metadata Manager), PowerExchange, and PowerConnect on Oracle, DB2, and SQL Server databases.
- Good working knowledge of Agile methodology.
- Strong oral, written, and organizational skills; excellent time and project management skills.
- Positive, enthusiastic attitude and a firm believer in teamwork.
- Mentored and trained users for the successful execution of the project.
- Provided 24/7 triage and production support.
OLAP/BI: Power BI 2019 (Power BI Desktop, Power BI Service), Tableau 2019.x/10.x (Tableau Desktop, Tableau Server), Cognos 11.x/10.x/8.x (Report Studio, Query Studio, Analysis Studio, Event Studio, Workspace, Workspace Advanced, Active Reports, Transformer (OLAP Cubes) & Framework Manager), TIBCO Spotfire 6.5 (Professional and Web Player), Cognos TM1 10.2 (TM1 Architect, TM1 Perspective, TM1 Web and TM1 Contributor), SAS Enterprise Guide, SAS Studio
Databases: Oracle 9i & 10g, MS SQL Server 2005 & 2008, MS Access, MySQL, Teradata, Snowflake, SAS
Operating Systems: Windows NT/95/98/XP/Vista/7, Linux, Solaris
Confidential, Hillsboro, OR
BI Consultant/Analytics/Data Engineer
- Performed a major role in gathering and understanding business requirements from Confidential demand planners and in designing/building an analytics solution using star/snowflake schemas in the data warehouse.
- Completed data analysis in Teradata and Oracle against the reporting requirements.
- Extracted data from multiple data sources (ECC, SAS data sets, Oracle) into Teradata warehouse.
- Created, validated, tested, and ran sequential and batch sessions, scheduling them to run at specified times using Informatica Workflow Manager.
- Performed MySQL database development using the Workbench 6.2 client; highly proficient in writing and optimizing complex MySQL queries.
- Responsible for developing, supporting, and maintaining ETL (Extract, Transform, and Load) processes using Informatica PowerCenter.
- Developed Teradata macros and stored procedures to load data into incremental/staging tables and then move data from staging into base tables.
- Developed scripts to load data from source to staging and from staging to target (base) tables using the Teradata FastLoad, MultiLoad, and BTEQ utilities.
- Designed and enhanced a Power BI analytics solution with visualizations and Cognos reports that enable demand planners to view different perspectives of the data.
- Generated ad-hoc reports in Excel Power Pivot and shared them with decision makers via Power BI for strategic planning.
- Utilized Power BI (Power View) to create analytical dashboards depicting critical KPIs such as legal case matters, billing hours, and case proceedings, with slicers that enable end users to filter the data.
- Expertise in writing complex DAX functions in Power BI and Power Pivot.
- Automated Power Query refresh using a PowerShell script and Windows Task Scheduler.
- Pulled data into Power BI from various sources such as Teradata, Oracle, Snowflake, and Aurora MySQL.
- Created workspaces and content packs for business users to view the developed reports.
- Configured automatic and scheduled refresh in Power BI Service.
- Wrote calculated columns and measures in Power BI Desktop to support data analysis.
- Worked on reports at all granularities: yearly, quarterly, monthly, and daily.
- Worked with the full range of transformations available in the Power BI Query Editor.
- Designed and enhanced Cognos Framework Manager models (relational and DMR) based on business requirements.
- Migrated Cognos/Power BI reports and dashboards from DEV to QA to PROD using ServiceNow.
- Monitored and scheduled jobs and reports in production to ensure reports are delivered successfully to demand planners.
- Involved in upgrading the reporting solution from Cognos 10.0.6 to Cognos 11.0.6.
- Used the Index Tuning Wizard and estimated query plans to tune the performance of SQL queries.
- Developed a Python framework to automate processes in Teradata.
- Developed shell scripts on Linux and scheduled them as cron jobs.
- Coordinated Quality Assurance System Testing, System Integration Testing, and User Acceptance Testing for all new report developments.
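The staging-to-base incremental load pattern in the bullets above can be sketched in Python. This is a minimal illustration only: table and column names are hypothetical, and the generated MERGE statement is printed rather than executed (in practice it would run through a BTEQ script or a Teradata driver).

```python
# Sketch of the staging-to-base incremental load pattern described above.
# Table and column names are hypothetical placeholders.

def build_incremental_merge(base, staging, key_cols, data_cols):
    """Generate a Teradata-style MERGE that upserts staging rows into base."""
    on = " AND ".join(f"tgt.{c} = src.{c}" for c in key_cols)
    set_clause = ", ".join(f"tgt.{c} = src.{c}" for c in data_cols)
    all_cols = key_cols + data_cols
    insert_cols = ", ".join(all_cols)
    insert_vals = ", ".join(f"src.{c}" for c in all_cols)
    return (
        f"MERGE INTO {base} AS tgt\n"
        f"USING {staging} AS src\n"
        f"  ON {on}\n"
        f"WHEN MATCHED THEN UPDATE SET {set_clause}\n"
        f"WHEN NOT MATCHED THEN INSERT ({insert_cols}) VALUES ({insert_vals});"
    )

sql = build_incremental_merge(
    base="demand_db.fct_forecast",        # hypothetical base table
    staging="stage_db.stg_forecast",      # hypothetical staging table
    key_cols=["product_id", "week_id"],
    data_cols=["forecast_qty", "load_ts"],
)
print(sql)
```

Generating the statement from column lists keeps the load logic in one place, which is the usual motivation for wrapping such loads in a small Python framework.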
Confidential, Woodlands, TX
Tableau Consultant/Data Engineer
- Responsible for building an analytics solution per business requirements.
- Responsible for identifying data for the analytics solution.
- Made strategic recommendations on data collection, integration, and retention requirements, incorporating business needs and knowledge of best practices.
- Created various complex visualization dashboards using cross tables, bar charts, line charts, scatter plots, and more.
- Built and published customized interactive reports and dashboards, with report scheduling, on Tableau Server; involved in performance tuning of Tableau reports.
- Created extract filters, data source filters, and context filters to improve performance.
- Extensive experience using all Level of Detail (LOD) expressions (INCLUDE, FIXED, and EXCLUDE) to create visualizations at a level of detail different from the view.
- Developed storytelling dashboards in Tableau Desktop and published them to Tableau Server, allowing end users to explore the data on the fly using quick filters for on-demand information.
- Developed Tableau workbooks from multiple data sources using Data Blending.
- Used the SAS Import/Export Wizard as well as SAS programming techniques to extract data from Excel.
- Used PROC SQL and SAS Enterprise Guide to extract large SAS data sets.
- Responsible for solving data-related issues and communicating resolutions with R&D and data sources.
- Provided data analysis, data modeling, and data mapping capabilities.
- Discovered and removed or corrected errors in data records through data cleansing activities.
- Accountable for the assessment, delivery, quality, accuracy, and tracking of any production data fixes.
Environment: Tableau 10.2 (Tableau Desktop and Tableau Server), MS SQL Server, SQL, Python, SAS 9.4, SAS Enterprise Guide, SAS Studio, Windows, Linux, ServiceNow.
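The data-cleansing activities listed in this role can be sketched as a small Python routine: deduplicate records, normalize inconsistent date formats, and correct blank values. Field names and the sample rows are hypothetical; a real pipeline would read from the source systems rather than an inline list.

```python
# Minimal sketch of the data-cleansing step described above.
# Record structure and field names are hypothetical.
from datetime import datetime

def cleanse(records):
    """Drop duplicate IDs, normalize date formats, and fix blank amounts."""
    seen, clean = set(), []
    for rec in records:
        if rec["id"] in seen:            # remove duplicate records
            continue
        seen.add(rec["id"])
        rec = dict(rec)
        # correct a common error: mixed date formats -> ISO dates
        for fmt in ("%Y-%m-%d", "%m/%d/%Y"):
            try:
                rec["date"] = datetime.strptime(rec["date"], fmt).strftime("%Y-%m-%d")
                break
            except ValueError:
                continue
        # correct blank amounts to an explicit 0.0
        rec["amount"] = float(rec["amount"]) if rec["amount"] else 0.0
        clean.append(rec)
    return clean

rows = [
    {"id": 1, "date": "03/15/2017", "amount": "12.5"},
    {"id": 1, "date": "03/15/2017", "amount": "12.5"},   # duplicate record
    {"id": 2, "date": "2017-03-16", "amount": ""},       # blank amount
]
print(cleanse(rows))
```

Running this yields two clean records: the duplicate is dropped, the US-style date is normalized to ISO form, and the blank amount becomes 0.0.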
Confidential, Pittsburgh, PA
BI Team Lead
- Responsible for building a reporting solution per customer requirements.
- Gathered requirements and took them from whiteboard to implemented reports/dashboards in Tableau and Cognos.
- Effectively used Tableau's data blending feature to combine data from DB2 and MS SQL Server.
- Converted several Cognos reports to Tableau dashboards for visualization.
- Created Tableau scorecards and dashboards using stacked bars, bar graphs, scatter plots, geographical maps, and Gantt charts via the Show Me functionality.
- Created custom hierarchies in Tableau to meet business requirements.
- Created action filters, parameters and calculated sets for preparing dashboards and worksheets in Tableau.
- Published customized interactive reports and dashboards, with report scheduling, on Tableau Server.
- Executed and tested required queries and data before publishing Dashboard.
- Created appropriate models in Framework Manager and built reports from them using Report Studio.
- Responsible for identifying data for decision-support BI reporting.
- Responsible for creating metadata from different data sources.
- Responsible for delivering the complete reporting application to the business.
- Involved in complete data analysis from a reporting-requirements perspective.
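The "executed and tested required queries and data before publishing" step above can be sketched as a simple source-vs-extract reconciliation. In this illustration, sqlite3 stands in for the actual DB2/SQL Server sources, and the table names are hypothetical.

```python
# Sketch of validating query results before publishing a dashboard:
# reconcile row counts between a source table and the extract built from it.
# sqlite3 is a stand-in for DB2/SQL Server; table names are hypothetical.
import sqlite3

def counts_match(conn, source_table, extract_table):
    """Return (ok, source_count, extract_count) for a row-count check."""
    cur = conn.cursor()
    src = cur.execute(f"SELECT COUNT(*) FROM {source_table}").fetchone()[0]
    ext = cur.execute(f"SELECT COUNT(*) FROM {extract_table}").fetchone()[0]
    return src == ext, src, ext

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE sales_src (id INTEGER, amt REAL);
    CREATE TABLE sales_extract (id INTEGER, amt REAL);
    INSERT INTO sales_src VALUES (1, 10.0), (2, 20.0);
    INSERT INTO sales_extract SELECT * FROM sales_src;
""")
ok, src, ext = counts_match(conn, "sales_src", "sales_extract")
print(ok, src, ext)
```

A mismatch between the two counts would block publishing until the extract query is corrected, which is the intent of the pre-publish test step described above.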