Azure Big Data Analytics Architect/Developer Resume
SUMMARY
- Over 20 years of IT experience with a special focus on Business Intelligence, Data Analysis, Data Mining, BI Reporting, Analytics, Metadata Management, Data Quality, Data Discovery, Data Visualization, Predictive and Prescriptive Analytics, Data Warehousing, Big Data Lakes, Dimensional Modeling and Management Dashboards, spanning both technical and management areas.
- Migrated databases to Azure Data Lake, Azure Data Lake Analytics, Azure SQL Database, Databricks and Azure SQL Data Warehouse; controlled and granted database access; and migrated on-premises databases to Azure Data Lake Store using Azure Data Factory.
- Comprehensive understanding of BI platforms and project execution, including Analysis of Alternatives (AOA) studies, tool selection, architecture, design, development, optimization, automation, user authorization/entitlements, administration and support of many enterprise-wide, complex BI/DW initiatives across global organizations.
- Certified Project Management Professional (PMP) from PMI. Program/project management experience leading cross-functional teams in the design and launch of cutting-edge solutions for complex, mission-critical technology initiatives, process enhancements, and defining industry best practices and standards.
- Excellent verbal and written communication skills, keeping stakeholders up to date on technological and delivery risks and their mitigation plans.
- Excellent ability to motivate and empower high-performing and diverse teams with the capacity to make solutions-oriented and creative contributions in highly demanding situations.
- Extensively used Agile and SDLC processes across project life-cycle phases. Involved in idea inception, design analysis, design/architecture of enterprise Business Intelligence and reporting systems, estimation, planning and successful execution of projects.
- Identifying and proposing short-term and long-term BI/DW objectives and planning their implementation.
- Leading multi-disciplinary, globally diverse technology teams and managing project resources and deliverables.
- Extensive experience with N-Tier architecture and related technologies spanning front-end (UI), middle-tier and back-end database layers.
- Good understanding of regulatory aspects and requirements of investment banking, trading front-end and back-end processes, Basel I, Basel II, Basel III and SOX requirements and objectives, and of building compliant systems.
- Implementation of Big Data technologies and Hadoop enterprise big data solution landscape in a cloud environment.
- Expertise in system integration, security integration and application integration. Executed many upgrade and re-engineering projects.
TECHNICAL SKILLS
BI Tools: QlikView, Qlik Sense, Tableau, Spotfire, Jasper, BIRT, Business Objects, MicroStrategy, Cognos, R, SAS, Azure, NPrinting, Crystal Reports, SSIS, SSAS, SSRS, Machine Learning, NLP
ETL Tools: Informatica Power Center/Data Quality/MDM, Ab Initio
Databases/DB Tools: Oracle, DB2, MS SQL Server, Netezza, Hyperion, Essbase, Sybase, MS Access, PL/SQL, DB Artisan, UDB, TOAD, Rapid SQL, Snowflake, Databricks
Operating Systems: Windows NT, Windows 2003, Windows 2008, Linux, UNIX, Solaris
Other: Java, Salesforce, Cloudera, Hadoop, HDFS, MongoDB, Hive, Pig, MapReduce, data lake, Big Data analytics, Impala, ASP, JSP, Tomcat, SOAP, Azure Cloud, ADF, Siebel, Visual Basic, JBoss Portal, MS Office, Shell Scripting, Perl, SVN, SharePoint, XML, Schedulers - Tidal/Autosys, C++, HTML, Visual Basic 5.0, JavaScript, LDAP, VBScript, Murex, Apache Thrift, Scala, AWS Cloud, Cosmos DB, AWS S3, AWS Lambda, Redshift, Jupyter, Machine Learning, NLP, Beeline, Sqoop, Spark, PySpark, Kubernetes, Spark SQL, Graph SQL, CLIPS, ontology, Protégé
PROFESSIONAL EXPERIENCE
Confidential
Azure Big Data Analytics Architect/Developer
Responsibilities:
- Worked on data analysis, data quality, data governance and data modeling for product data.
- Designed Azure Databricks serverless clusters, shared workspaces and notebooks for various processing activities in the pipelines, and maintained job schedules.
- Designed ADF pipelines and the pre-processing and data ingestion process for MSCI ESG data into Azure Synapse SQL DW using Azure Databricks (see the sketch at the end of this list).
- Monitored Databricks clusters and optimized them for various I/O and compute performance requirements.
- Worked on the process for generating ESG ratings and standardizing them across companies.
- Generated portfolio ESG rating calculations, performed impact analysis and comparison against the index, and managed client target ratings.
- Worked on setting up the Azure Data Lake environment for unstructured data and log files from multiple applications.
- Designed Horizon pipelines using the Spark engine for storing and delivering daily equities and options prices and Greeks.
- Used the Azure data migration process to lift and shift the securities risk master database to Azure.
- Worked on the Azure data lake for capturing different data sets and on automating their processing.
- Worked with regulatory and compliance groups to capture data reporting requirements for downstream systems.
- Set up the data cleaning process and generated quality data for data lake ingestion and predictive analytics using MLlib on Databricks.
- Performed data profiling to ensure clarity and quality of the data.
- Created dashboard visualization objects for sharing and reuse across different dashboards.
- Created and distributed dashboards for transactions, portfolio risk, assets and performance analysis.
- Created Azure REST API endpoints to deliver the performance and client reporting data feeds.
- Developed an Azure App Service delivering UI components for use in dashboards and presentations.
- Created Market Abuse, DOI 13F/G and employee trading dashboards for pre-trade and post-trade compliance.
- Integrated with Snowflake via Snowpipe to share market data with other departments globally.
- Set up Snowflake internal and external stages and the model for the data ingestion process.
- Set up change data capture pipelines through Snowpipe and Azure Event Hubs.
- Implemented hierarchical roles for database access and the management process in Snowflake.
- Delivered static and reactive dashboards supporting drill-downs and actionable dashboard components.
- Worked on creating SOD/EOD data source automation processes.
- Generated a regression machine learning predictive model to identify trade audit default cases.
- Worked on creating the audit and trade workflow process for third-party systems within the output pipelines.
- Built a Python process to produce Tableau TDE files for dashboard consumption, improving performance.
- Wrote various SQL queries and performance-tuned them for use in different dashboards and other data extraction processes.
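A minimal PySpark sketch of the kind of Databricks notebook step used in the ESG ingestion pipeline referenced above; the storage path, column names and target table are hypothetical placeholders, not values from the engagement.

```python
# Hypothetical sketch: clean an MSCI ESG extract from ADLS and land it in a
# curated Delta table before the ADF pipeline loads Azure Synapse SQL DW.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("esg_ingest").getOrCreate()

raw = (
    spark.read.option("header", "true")
    .csv("abfss://raw@examplelake.dfs.core.windows.net/msci_esg/latest/")  # placeholder path
)

curated = (
    raw.filter(F.col("issuer_id").isNotNull())                  # drop rows missing the issuer key
    .withColumn("esg_rating", F.upper(F.trim("esg_rating")))    # normalise rating codes
    .withColumn("load_date", F.current_date())                  # audit column for reloads
)

curated.write.format("delta").mode("overwrite").saveAsTable("curated.esg_ratings")
```

In practice a notebook step like this would be triggered from the ADF pipeline, with the curated table then copied into the Synapse SQL DW.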
Confidential
Big Data Analytics Azure Architect
Responsibilities:
- Designed the data lake on Azure infrastructure for the application in a performance-optimized environment.
- Worked on data analysis, data quality, data governance and data modeling for business use cases using Erwin.
- Designed Sqoop infrastructure using Python to support data ingestion from SQL Server and Oracle data sources (see the sketch at the end of this list).
- Designed and set up HBase for data storage for analytics purposes.
- Designed and built Hive tables and views for aggregated and detail-level representations.
- Created a Python data profiling process used in data model design and for performance optimization.
- Configured Kafka messaging queues for integration and for triggering streaming jobs.
- Set up Docker images and YAML scripts for the application cluster on Kubernetes.
- Created linked and external tables in Hive to provide a single source for all analytic data.
- Designed processes for various Claims and Prior Authorization workflows, automated using a rule-based engine.
- Developed multiple data reconciliation processes using ADF and automation processes using cron jobs and Sqoop jobs.
- Worked with Spark SQL on the graph database (RocksDB) and Redshift, creating automation, data enrichment and export processes using PySpark.
- Created and managed various source-based, system-centric ontologies and merged ontologies for MDM.
- Built an Azure big data lake for analytics in the graph DB using Graph SQL, integrating claims, billing, prescription, pharmacy, drug and adjudication data.
- Designed and delivered a self-service BI solution using Spark and ML/NLP algorithms on the big data platform.
- Generated billing and claims analytics dashboards through Tableau connections.
- Created data visualizations using geospatial data on a maps dashboard for the claims and billing scorecard.
- Designed stories/dashboards for senior management KPIs.
- Designed and developed data exploration, preprocessing and model generation for predictive analytics using JupyterLab.
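An illustrative sketch of the Python wrapper pattern used to drive the Sqoop ingestion mentioned above; the JDBC connection string, credentials path and table list are placeholders rather than actual values.

```python
# Hypothetical wrapper: run Sqoop imports from SQL Server into staging Hive
# tables, one table at a time, failing fast if any import errors out.
import subprocess

TABLES = ["claims", "prior_auth", "members"]  # placeholder table list

def sqoop_import(table: str) -> None:
    cmd = [
        "sqoop", "import",
        "--connect", "jdbc:sqlserver://dbhost:1433;databaseName=claims_db",
        "--username", "etl_user",
        "--password-file", "/user/etl/.dbpass",  # keeps the password off the command line
        "--table", table,
        "--hive-import",
        "--hive-database", "staging",
        "--num-mappers", "4",
    ]
    subprocess.run(cmd, check=True)  # raise if sqoop exits non-zero

if __name__ == "__main__":
    for t in TABLES:
        sqoop_import(t)
```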
Confidential
BI/DW Architect/Lead
Responsibilities:
- Worked with CXOs to discuss the BI and data analytics strategy, establishing short-term goals and a long-term vision.
- Performed weekly reviews of risks and timelines, took necessary steps proactively and kept senior management informed of risks and corrective actions.
- Worked with corporate technology teams to architect the BI and predictive solution meeting technical and business standards. Ingested data into one or more Azure services (Azure Data Lake, Azure Storage, Azure SQL, Azure DW) and processed the data in Azure Databricks.
- Worked with the data sourcing team to understand the implementation, designing the data integration using SSIS and process automation to meet scheduled data availability from SSAS.
- Worked on defining data quality, data governance processes, data lineage and the data procurement strategy, and built an automated process to support them.
- Closely worked with end clients to identify key metrics and fact analysis for the global finance analytics solution.
- Worked with cross functional database, ETL, web support team to ensure co-ordination and co-operation for infrastructure and technology support.
- Identifying operational gaps, defining process follow-ups and controls to ensure a quality product delivery and minimizing financial, resource and schedule risks.
- Designed and implemented the n-tier security architecture, data models and data mart architecture within a shareable design framework.
- Initiated and managed a global data quality initiative, building a global data warehouse providing a 360-degree view of clean corporate data.
- Evaluate and analyze data and automate integration with data sources for data warehouse processes.
- Design and implement BI Application security and access rules for the applications.
- Created mashups for Salesforce integration, providing a KPI analysis interface with the Qlik Sense dashboard.
- Created GIS map-based visualization dashboard mashups for distribution analysis.
- Developed highly interactive visualization dashboards with a fully automated process.
- Design and delivery of the Qlikview datamodel for use in BI management dashboards.
- Set up Hadoop infrastructure for evaluation and planned an enterprise data lake with both structured and unstructured data ingestion from multiple sources.
- Worked on setting up the environment and plan for dashboard migration from QlikView to Tableau.
- Set up unstructured data storage on AWS S3 for various external feeds.
- Set up data exploration, preprocessing and the pipeline for train/test model data preparation for the ML process.
- Prepared a shared ML data framework for model and data sharing across key predictive ML use cases.
- Created a product recommendation model for clients using the K-Means ML algorithm (see the sketch at the end of this list).
- Designed Sqoop infrastructure to support data ingestion from multiple sources including Oracle, SQL Server and log files.
- Designed and set up HBase for big data storage, with Hive queries and Sqoop to analyze and export the data.
- Set up the automation process using cron jobs and Sqoop jobs and executed the end-to-end data ingestion process.
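One plausible Spark ML rendering of the K-Means clustering step behind the product recommendation model mentioned above; the feature columns and table names are illustrative assumptions, not the actual model inputs.

```python
# Hypothetical client-segmentation step: cluster clients on a few behavioural
# features, then recommend products that are popular within each cluster.
from pyspark.sql import SparkSession
from pyspark.ml.feature import VectorAssembler
from pyspark.ml.clustering import KMeans

spark = SparkSession.builder.appName("client_segments").getOrCreate()

clients = spark.table("analytics.client_features")  # placeholder source table

assembler = VectorAssembler(
    inputCols=["aum", "trade_frequency", "product_breadth"],  # assumed features
    outputCol="features",
)
features = assembler.transform(clients)

model = KMeans(k=8, seed=42, featuresCol="features").fit(features)

# Persist each client's cluster id; downstream logic maps clusters to products.
segments = model.transform(features).select("client_id", "prediction")
segments.write.mode("overwrite").saveAsTable("analytics.client_segments")
```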
Confidential
BI Architect/Tech Lead/SME
Responsibilities:
- Worked with business users to identify technology needs and define technology components for an optimized, highly available and scalable architecture solution.
- Worked on defining the data capture strategy, transformation and integration of multiple data sources for the data warehouse.
- Worked with end clients to identify key metrics and fact analysis for the global expense analytics solution.
- Led onsite and offsite teams for application development and support activities.
- Showcased innovative data visualizations giving enterprise users concise, effective and interactive data representations.
- Worked with cross functional database, ETL, support team to ensure co-ordination and co-operation for infrastructure and technology support.
- Identifying operational gaps, defining process follow-ups and controls to ensure a quality product delivery and minimizing financial, resource and schedule risks.
- Delivered a multidimensional data model in QlikView to integrate diverse data from various expense categories.
- Worked on setting up the automation process to ensure its smooth execution.
- Implemented custom user-defined data entitlement security by region, product and country.
- Design and delivery of the Qlikview datamodel for use in BI management dashboards.
- Evaluated BI tools for self-service data discovery application for better analysis of data and its presentation.
Confidential
BI Architect/Tech Lead/SME
Responsibilities:
- Worked with various technology groups and vendors on BI tool evaluation. Identified the voice of the customer and defined tool selection metrics, grading, evaluation, selection and procurement.
- Worked on building a global data warehouse with data quality, data governance and automation processes built in, providing a 360-degree view of data from Oracle and SQL Server databases and ensuring a single version of truth across the global organization.
- Closely worked with business users in evaluating various solutions using Qlikview for enterprise business intelligence and reporting solution.
- Produced Business Proposal document, Technical architecture requirements, Business requirement documents, Data flow diagrams, Use Cases and worked on getting approval from stakeholders.
- Plan and outline strategy for horizontal growth of the tool and standardization across the organization.
- Building a continuous process for Qlikview Application monitoring, alert mechanism, research and improvements of the processes for better performance and reduced resource utilization.
- Collaborate with Business users and project team members to create working BI dashboard/visualization prototypes.
- Worked with interdisciplinary teams to deliver various corporate-level data workflows, data capture and integration using SSIS data transformation packages.
- Identified and delivered data visualization components in QlikView, providing users a 360-degree view through effective navigation, drill-down, drill-up, slice and dice, set analysis, trends and actionable components.
- Worked with various multidimensional data components for defining cross dimensional data visualization for use in Management Dashboards.
- Worked on identifying IT and business components and managing coherency between IT and business goals and building scalable self-service BI solution.
- Designed and set up the automated refresh schedule through Autosys based on data dependency and availability.
Confidential
BI Tech Lead/SME
Responsibilities:
- Worked on developing a new system for the reporting environment for better performance and full automation.
- Responsible for designing the new architecture for the Actuate reporting system and custom security with ClearTrust.
- Designing the architecture for their internal and external reporting systems.
- Prepared the upgrade plan from the existing Actuate version 8 to Actuate 10.
- Working with the users on identifying the new requirements for collateral, loans and risk summary reports and proposing the technical solution for meeting the business needs.
- Working with development teams in managing the release plan. Worked on the migration of the reporting system.
- Produced technical requirement and business requirement documents and socialized them with stakeholders for approval.
Confidential
BI Architect/Lead/SME
Responsibilities:
- This project involved designing a web-based application to display a concise view of the complete enterprise project portfolio, including linking the application with other systems for root cause analysis functionality.
- Responsible for designing the data warehouse using Informatica Designer, capturing various analysis and dimension keys and measures for the management analytics application.
- Led a cross-functional team for application development and support activities.
- Designing the MDM process to ensure that global data is incorporated seamlessly into the warehouse.
- Used Workflow Manager to design the ETL process and monitor it in PowerCenter.
- Designed various strategies to optimize loading and transformation with aggregator, lookup and router transformations, mappings, etc.
- Worked with the PMO in gathering reporting and dashboard requirements.
- Produced technical requirement and business requirement documents and worked on getting stakeholder approval.
- Gathered user information on merging multiple entities and developed data migration scripts covering projects, financial plans, resources, collaboration manager, etc.
- Designing and Implementing Clarity Security with dashboard security for access to projects.
- Worked on designing and setting up of Business Objects universes for Projects, Financials, WIP and resources for providing quality data for batch reporting.
- Worked on gathering user requirements and development of portfolio and financial reports using Business Objects.
- Worked on creating financial reports against the Niku Clarity database, accessing the Oracle database using the Clarity data model.
Confidential, KS
BI Architect/Lead/SME
Responsibilities:
- Worked on designing the integration and deployment of the trade data warehouse system to incorporate the various trade activities for better analysis.
- Planned and designed the system architecture for the Actuate 9 environment and upgraded the Ab Initio infrastructure.
- Gathering business requirement for new Valuation reports, Margin and ACH Reports and creating Functional requirements document (FRD) and Technical design specifications (TDS) documents.
- Worked on implementation of various schedules for DataMart scripts for daily trade and risk engine.
- Architecting the solution and development of Business reports and Information Objects/Queries.
- Integrated BI Solution with the trading data warehouse system.
- Worked on a client branding solution for the multi-client EPD application.
- Planned the release and deployment of the application in the production environment.