- Seeking a challenging opportunity to apply expertise in Big Data and Analytics with Business Intelligence development tools, helping companies integrate systems and optimize ETL processes and data quality.
- Experienced in customer-facing planning, design, development, and deployment of enterprise-wide analytical and Business Intelligence dashboards, with a strong focus on statistical analytics, geo-analytics, systems integration, data warehousing, data profiling and cleansing, user-experience optimization, cross-platform application development, and complex interactive visualizations that track business trends. 5+ years of hands-on experience with TIBCO Spotfire Enterprise on data governance and AML projects.
- Experienced in writing SQL (MS SQL, DB2 SQL, Netezza SQL, HiveQL, Big SQL, and PL/SQL) for data analysis. Created stored procedures for custom functions in Spotfire, along with Python scripting for big-data prediction and analysis. Hands-on experience administering, deploying, and maintaining big-data analytical models in Power BI, Cognos, QlikView, Spotfire, Tableau, Business Objects, and Alteryx. Worked on Qlik Tech enterprise products (QlikView and Qlik Sense) and BO dashboard development as part of environment-transition projects. Key expertise includes enhancing existing Tableau and BO dashboards to reflect the latest trends and visuals.
- Customized security and data-connectivity hierarchies so that users work with uniform permissions and roles across projects. Wrote Python code to read Active Directory (LDAP) and Windows roles to implement row-level security at both the dashboard and data levels, along with personalized dashboards for the business. Created complex statistical and financial analytical models, trend-over-time visuals, ranking and forecast reports, and executive dashboards using the TIBCO and Qlik Tech lines of enterprise products.
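The row-level security pattern described above can be sketched as follows. The group names and filter fragments here are hypothetical; in a real deployment the user's group list would come from an LDAP query (e.g. via the `ldap3` library) rather than a hard-coded mapping.

```python
# Sketch: map Active Directory group membership to row-level security
# filters for dashboard queries. Group DNs and filters are illustrative.
GROUP_FILTERS = {
    "CN=BI_US_Analysts,OU=Groups,DC=corp,DC=example": "region = 'US'",
    "CN=BI_EU_Analysts,OU=Groups,DC=corp,DC=example": "region = 'EU'",
    "CN=BI_Admins,OU=Groups,DC=corp,DC=example": "1 = 1",  # unrestricted
}

def row_level_filter(user_groups):
    """Build a SQL WHERE fragment from a user's AD group memberships."""
    clauses = [GROUP_FILTERS[g] for g in user_groups if g in GROUP_FILTERS]
    if not clauses:
        return "1 = 0"  # deny by default when no group matches
    return " OR ".join(f"({c})" for c in clauses)

print(row_level_filter(["CN=BI_US_Analysts,OU=Groups,DC=corp,DC=example"]))
# -> (region = 'US')
```

Denying by default when no group matches keeps unprovisioned users from seeing any rows.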
- Hands-on experience with SDLC documentation, end-user training, and installing and configuring servers and load balancers for Spotfire users. Experienced in 24/7 production support and troubleshooting. Authored system-build guides covering configuration, SSO, safe synchronization, additional domains, and third-party tool deployment, in collaboration with TIBCO and Qlik Tech.
BI Analytical Tools: TIBCO Spotfire Enterprise 6.0 through 7.7, SQL Server Management Studio (SSIS, SSRS, SSAS), QlikView, Qlik Sense, Tableau, Power BI Professional, Business Objects Analysis, Datameer, IBM Aginity Workbench, MicroStrategy Reporting, Steer wise Data Symantec Analyzer, AXON.
Database: MongoDB, Oracle, SQL Server 2012, DB2, SAS data sets, Teradata, and flat files.
Languages: SQL, PL/SQL, HiveQL, DB2 SQL, Big SQL, Netezza SQL, IronPython.
IDE/Text Editors: Eclipse, Visual Studio Code, SQL Developer and WebStorm.
Version Control Systems: GIT, Bitbucket and SVN.
Operating Systems: Windows, Linux, and Mac OS.
Other Tools: Informatica Power Center, Informatica Data Quality Developer and Analyst, Bitbucket, TOAD, SVN, Microsoft Visual Studio 2012, Microsoft Visual Source Safe, IIS, Web Services, Visio, PowerBuilder, Team Foundation Server, Microsoft SharePoint 2007, Autosys
- Developed an AI-powered data catalog, data lineage, and an Enterprise Glossary using EDC tools such as Steer wise Data Symantec Analyzer and AXON.
- Analyzed various data systems and flows pushed into and pulled from the data lake using various databases and scripting.
- Created the Data Quality Rules definition document and supported MFT jobs for applications such as AXCIOM, CROWE, Wallstreet Suite, and other data sources.
- Consolidated reports based on various factors and eliminated duplicates.
- Created source-to-source mappings for various existing systems.
- Profiled data using the Informatica Analyst tool and prepared data quality rules for the processes.
- Created mappings, workflows, and applications using Informatica Developer, and scheduled the processes using Autosys.
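Scheduling an Informatica job through Autosys is typically done with a JIL command-job definition along these lines. All names and paths here are hypothetical placeholders, not from an actual project.

```
/* Hypothetical Autosys JIL: run a DQ mapping via a wrapper script on weekdays */
insert_job: DQ_CUSTOMER_ADDR   job_type: c
command: /opt/informatica/scripts/run_dq_mapping.sh m_customer_addr
machine: etl_server01
owner: etluser
start_times: "02:00"
days_of_week: mo,tu,we,th,fr
std_out_file: /var/log/autosys/dq_customer_addr.out
std_err_file: /var/log/autosys/dq_customer_addr.err
alarm_if_fail: 1
```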
- Collected data issues across the metadata ecosystem and worked with business owners and data stewards to remediate them.
- Created a highly structured and comprehensive control environment to assure that catalog content remains high quality.
Confidential, Chicago, IL
- Responsibilities included identifying, sourcing, normalizing, cleansing, aggregating, and running a series of analytics against bank data, including client demographic, geographic, and transactional data, working directly with large treasury and other financial departments to plan and execute the project.
- Created data sources/universes using Information Designer to connect various data marts, mashing up data from SQL Server/SQL Server DataDirect, SAS/SHARE functions, Hadoop, Netezza, and DB2. Ran enterprise-scale statistical and analytical models to explore the data for business needs. Served as administrator for SQL Server, Netezza, Hadoop, Datameer, Spotfire, QlikView, BO, and Tableau. Converted dashboards hosted in other BI environments, such as Tableau and BO, to be Spotfire compatible. Enhanced existing Tableau dashboards to meet advanced logic and business requirements, manipulating and analyzing large, complex, multi-dimensional data with a wide variety of analytical tools. Analyzed the quality of data used in critical reports such as the Fed Stress Test, QRM (Qualitative Risk Management), and AML using Informatica Data Quality Analyst, and prepared heat maps of data quality.
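A data-quality heat map like the one described above starts from per-column scores such as completeness. A minimal sketch, with illustrative column names and sample rows:

```python
# Sketch: per-column completeness scores feeding a data-quality heat map.
# Column names and sample records are illustrative only.
def completeness(rows, columns):
    """Return {column: fraction of non-null, non-empty values}."""
    total = len(rows)
    scores = {}
    for col in columns:
        non_null = sum(1 for r in rows if r.get(col) not in (None, ""))
        scores[col] = non_null / total if total else 0.0
    return scores

sample = [
    {"account_id": "A1", "region": "US", "balance": 100.0},
    {"account_id": "A2", "region": "",   "balance": None},
    {"account_id": "A3", "region": "EU", "balance": 50.0},
]
print(completeness(sample, ["account_id", "region", "balance"]))
```

Scores like these can then be bucketed into red/amber/green cells per column and report for the heat-map visual.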
- Customized security and data-connectivity hierarchies and group-level security standards for Spotfire users in the bank's P&C US Data and Analytics environment. Created information links shared across projects for the most common elements and reused them in various reports to minimize development effort, using a Python IDE. Developed new universes per user requirements by identifying the required tables in the data mart and defining the universe connections.
- Designed and developed sales dashboards for the marketing team with visualizations including bar charts, combination charts, Pareto charts, bullet graphs, heat maps, KPIs, cross tables, and line charts.
- Created information links and data elements using Information Designer, and used custom SQL in complex information links.
- Used property controls and action controls within scorecards, scheduled through Automation Services for embedding the data.
- Dashboards access data from various sources: Excel files on SharePoint, CSV files on shared drives, and SAS data sets from a SAS data source.
- Applied calculated columns and custom expressions within reports to derive informative data per business requirements.
- Extensively used Library Administrator to create folders for all developers and assign appropriate security within Spotfire.
- Created performance scorecards using combination charts, summary tables, and parallel coordinate plots.
- Created hyperlink reports to provide drill-down reports for end users.
- Participated in the Spotfire version upgrade from 3.2.1 to 4.0.2; evaluated and resolved all impacted report issues.
- Provided 24/7 on-call production support.
Environment: Spotfire Professional 6.0, Spotfire Web Player, Spotfire Enterprise Player, SQL Server 2008, SQL.