BI Data Architect Resume
SUMMARY
- Strong blend of analytics and BI skills across Informatica PowerCenter Designer (Big Data Edition), Microsoft Business Intelligence (SSIS, SSAS, SSRS), SAP Business Objects, and Big Data tools/technologies; played a number of technical roles across a variety of projects, from BI Developer to Data Architect/Analyst, Solution Architect, ETL Architect, and BI Subject Matter Expert.
- Strong hands-on experience in end-to-end data warehouse project implementations in various roles, including Architect, Analyst, and Developer.
- Served as Technical Architect, managing a number of projects across the organization, including the enterprise data warehouse, Big Data initiatives, and on-demand work on existing data integration projects.
- Hands-on development experience building Tableau dashboards and reports for various business needs.
- Hands-on experience with Big Data tools (Hadoop cluster management, HiveQL, Impala).
- Extensive knowledge of reporting objects such as facts, attributes, hierarchies, transformations, filters, prompts, calculated fields, sets, groups, and parameters in Tableau.
- Experience writing SQL queries, including joins across Hive tables and NoSQL databases, as shown in the HiveQL sketch after this list.
- Experience in Agile methodology, with management tracking and bug tracking using JIRA.
- Working experience designing and implementing complete end-to-end Hadoop infrastructure, including Pig, Hive, Sqoop, Oozie, and ZooKeeper.
- Proficient in building conceptual, logical, and physical data models using data modeling tools such as Erwin, MS Visio, and PowerDesigner.
- Hands-on experience in universe design and report design with Business Objects 4.0.
- Excellent T-SQL developer skills, including stored procedures, indexed views, user-defined functions, triggers, and distributed transactions.
- Hands-on involvement in cube performance tuning tasks, deployment strategy, storage modes, DSV creation, etc.
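To illustrate the Hive-side SQL work mentioned above, a minimal HiveQL sketch; all table and column names are hypothetical, and the NoSQL side is assumed to be exposed to Hive as an external HBase-backed table.

```sql
-- Hypothetical sketch: join a native Hive fact table with an
-- HBase-backed dimension table mapped into Hive as an external
-- table, then aggregate clicks per offer.
SELECT o.offer_id,
       o.offer_name,
       COUNT(*) AS click_count
FROM   clickstream c          -- native Hive table of click events
JOIN   offers_hbase o         -- external Hive table over HBase
  ON   c.offer_id = o.offer_id
WHERE  c.event_date >= '2016-01-01'
GROUP  BY o.offer_id, o.offer_name;
```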
PROFESSIONAL EXPERIENCE
Confidential
BI Data Architect
Tools/Technology: Vertica, Teradata, Talend Data Integrator, Microsoft SSIS, Informatica, PowerDesigner, Tableau
Responsibilities:
- Work closely with a team of data scientists on various business use cases: improving user engagement across activities, clicks, and impressions; tracking user behavior on new offers; generating models to determine the best offer for a specific user group based on online behavior; and helping convert clicks and impressions into orders (revenue) using advanced data analytics algorithms.
- Implemented multilayer ETL data extracts, data load logic, mappings, workflows, and stored procedures.
- Prepare and design ETL architectures for various enhancements received from business or IBM streams.
- Create mapping diagrams and high-level and low-level designs for various features.
- Effectively articulated reasoning for data model design decisions and strategically incorporated team member feedback to produce the highest quality data models.
- Worked with project and application teams to ensure that they understand and fully comply with data quality standards, architectural guidelines and designs.
- Create critical CXO-level dashboards in Tableau for customer recommendations based on user behavior.
- Design various ETL packages/mappings in SSIS, Informatica, and Talend to push/pull data from internal and external sources into Vertica (see the Vertica load sketch after this list).
- Create ETL Mapping spreadsheet for large datasets.
- Enhanced performance of various ETL processes.
- Understand various SPSS and SAS modeling recommendations and, based on the critical ones, prepare automated reports that help the VP drive changes to current offers, propose new offers, or rethink low-performing offers or content.
- Prepare highly complex decision trees based on statistical use cases and make them implementable in Tableau or Adobe Analytics as required (see the CASE-logic sketch after this list).
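A sketch of the Vertica load step referenced above; the staging table, file path, and options are hypothetical, but the pattern of staging a delimited extract and bulk-loading it with Vertica's COPY command is typical.

```sql
-- Hypothetical bulk load into Vertica; an SSIS/Talend/Informatica job
-- stages the extract as a delimited file, then issues a COPY like this.
COPY web.clickstream_stg
FROM '/data/stage/clickstream_20170101.csv'
DELIMITER ','
NULL ''
REJECTED DATA '/data/stage/clickstream_rejects.txt'
DIRECT;  -- write straight to disk storage for large batches
```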
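And a sketch of porting a statistical decision tree into BI-tool logic; the fields, thresholds, and segment names are hypothetical. Each branch of the tree reduces to one WHEN clause, and the same nested logic translates directly into a Tableau calculated field.

```sql
-- Hypothetical decision tree (e.g., from an SPSS/SAS model) expressed
-- as CASE logic; each WHEN branch mirrors one path through the tree.
SELECT user_id,
       CASE
         WHEN visits_30d >= 10 AND clicks_per_visit >= 3 THEN 'High intent'
         WHEN visits_30d >= 10                           THEN 'Browser'
         WHEN days_since_last_order <= 90                THEN 'Recent buyer'
         ELSE 'Dormant'
       END AS offer_segment
FROM   user_activity_summary;
```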
Confidential
Technical Architect
Tools/Technology: Big Data (Impala, HiveQL), Microsoft BI (SSIS, SSRS), BigInsights, Big SQL
Responsibilities:
- Collect data from various agency sources and integrate it into a central repository using Microsoft SSIS.
- Worked with Sqoop to move (import/export) data between relational databases and Hadoop, and used Flume to collect data and populate Hadoop.
- Involved in data-driven strategic initiatives for various business needs and led the entire life cycle of project implementations.
- Managed data modeling activities for new projects, creating ETL mappings and technical documentation.
- Performed data analysis for new projects, covering feasibility studies, effort estimates, ETL strategy, etc.
- Gained knowledge of Hadoop architecture and various components such as HDFS, NameNode, JobTracker, DataNode, TaskTracker, and MapReduce concepts.
- Building analytical cubes based on various business needs using Microsoft SSAS.
- Designing and implementing strategies for cube processing types for various analytical cubes based on business requirements.
- Performance tuning of SQL queries and stored procedures using SQL Profiler and Index Tuning Wizard in SSAS and SSIS.
- Involved in Batch Job scheduling using MAESTRO and TIVOLI for 60+ ETL Jobs.
- Designing ad-hoc reports in Tableau for various business requirements.
- Involved in SSAS cube performance activities and new cube design, understanding and translating business requirements and mapping those needs to cube elements.
- Involved in writing Big Data/HiveQL scripts for various analytical needs such as claim analysis, fraud analysis, and disability premium analysis (see the HiveQL sketch after this list).
- Successfully built ETL jobs in SSIS to push data into Hive tables with the help of Impala.
- Used the Cloudera GUI for administration purposes.
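A minimal sketch of the kind of HiveQL behind the claim analytics mentioned above; the tables, columns, and 3x threshold are hypothetical. The same statement also runs unchanged under Impala.

```sql
-- Hypothetical fraud-screening query: flag claims paid well above the
-- average for their procedure code, for downstream manual review.
SELECT c.claim_id,
       c.procedure_code,
       c.paid_amount,
       p.avg_paid
FROM   claims c
JOIN  (SELECT procedure_code,
              AVG(paid_amount) AS avg_paid
       FROM   claims
       GROUP  BY procedure_code) p
  ON   c.procedure_code = p.procedure_code
WHERE  c.paid_amount > 3 * p.avg_paid;
```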
Confidential
DW/ETL Architect
Responsibilities:
- Consolidated data from various legacy systems into ALTAIR, the SAP system of Publicis.
- Working as ETL/DW Architect, providing architectural support as well as contributing individually as an ETL developer for package development using SSIS.
- Involved in Cube Partitioning, Refresh strategy and planning and Dimensional data modeling in Analysis Services (SSAS).
- Preparing dashboards using calculations and parameters in Tableau.
- Have working knowledge of Power BI; helped different visualization teams with performance tuning tasks in Power BI.
- Owned the entire process of data pulling from different sources and involved in HDFS maintenance and loading of structured and unstructured data.
- Created and ran Sqoop (version 1.4.3) jobs with incremental loads to populate Hive tables.
- Involved in creating SSAS cubes, partitions, key columns, value columns, and calculated columns.
- Involved in creating data source views, tuning performance of derived columns and tables, and creating calculated members and named sets for the SSAS cubes.
- Ensured availability, capacity, and performance of the Hadoop infrastructure.
- Good understanding of distributed computing and the base Hadoop architecture.
- Managed and supported the Hadoop cluster, including cluster failover activity and capacity management; monitored and supported the cluster using Cloudera.
- Created and designed data sources and data source views using SQL Server Analysis Services 2008 (SSAS).
- Design and development of the strategy for overall ETL development.
- Wrote HiveQL code for various KPI-building and decision-making areas.
- Provided solutions to the offshore team on an as-needed basis for complex scenarios in SSIS and SQL Server.
- Created named queries in the SSAS data source view to produce appropriately summarized hierarchical data for two dimensions (see the named-query sketch after this list).
- Interacted actively with the user community and stakeholders to understand business requirements, convert them into technical documentation, and provide solutions to the offshore team for ETL and reports.
- Actively coordinated among the customer, different vendors, and onsite/offshore team members on various technical roadblocks and the respective deliverables.
- Thorough understanding of enterprise data warehouses using Kimball methodologies.
- Widely used Database objects like Views, Synonyms, Triggers and Global Links for reducing the load on ETL tools.
- Tuning the ETL for Terabyte loads for Optimum Performance, Dependencies and Batch Design.
- Generated and published metadata reports on the data model, ETL, and reporting for business users.
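A sketch of the named-query approach mentioned above; schema and column names are hypothetical. A DSV named query is just a T-SQL SELECT, so the summarization to the grain of the two dimensions happens before the cube reads the rows.

```sql
-- Hypothetical DSV named query: summarize fact rows to the combined
-- grain of the Date and Product hierarchies.
SELECT d.CalendarYear,
       d.CalendarQuarter,
       p.ProductCategory,
       p.ProductSubcategory,
       SUM(f.SalesAmount) AS SalesAmount
FROM   dbo.FactSales  AS f
JOIN   dbo.DimDate    AS d ON f.DateKey    = d.DateKey
JOIN   dbo.DimProduct AS p ON f.ProductKey = p.ProductKey
GROUP  BY d.CalendarYear, d.CalendarQuarter,
          p.ProductCategory, p.ProductSubcategory;
```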
Confidential
BI/DW Developer
Tools/Technology: Business Objects (BO 3.1, 4.0), SSIS, SSAS, Cognos, HP Service Manager 9/Service Center
Responsibilities:
- Worked with various Confidential teams across the APAC region to coordinate data management practices, report availability, data analytics strategy, and change and release management, including onshore/offshore coordination, alongside individual contributions on multiple projects/engagements.
- Implemented and worked within an onshore-offshore delivery model during the design, development, and testing phases.
- Implemented a large-scale reporting solution blending SSIS and Business Objects to fulfill business needs.
- Extracted data from various sources like SQL Server 2008/2012, Oracle, .CSV, Excel and Text file from Client servers and through FTP.
- Handled Performance Tuning and Optimization on SSIS and MDX, with strong analytical and troubleshooting skills for quick issue resolution in large-scale production environments located globally.
- Understood business functions from the functional owners of a report or group of reports and developed the high-level architecture for their development.
- Implemented data warehousing schemas such as star schema and snowflake schema.
- Involved in designing and building universes, classes, and objects.
- Resolved loops and Traps using Contexts and Aliases while designing Universes.
- Set up Universe level conditions to restrict data based on report requirements.
- Implemented row-level security on select sensitive data for senior leadership of the organization.
- Developed and created classes with Dimension, Detail & Measure objects and Developed Custom hierarchies to support drill down reports.
- Hands-on knowledge of various aspects of cube development, storage modes (MOLAP, ROLAP, and HOLAP), and measures and measure groups and their types.
- Hands-on knowledge of the different types of relationships between a cube dimension and a measure group, proactive caching, etc.
- Created derived tables in the Designer to enhance the capabilities and performance of the universe (see the derived-table sketch after this list).
- Modified existing universes: added new objects, edited joins, mapped existing objects to renamed tables, reviewed and tested packages, and fixed bugs (if any) using SQL Server 2005 Business Intelligence Development Studio.
- Studied existing OLAP system(s) and created facts, dimensions and star schema representation for the data mart and warehouse systems.
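A minimal sketch of a universe derived table of the sort mentioned above; table and column names are hypothetical. The SELECT lives inside the universe, so report objects map to its pre-aggregated columns rather than to the raw tables.

```sql
-- Hypothetical derived table: pre-aggregate in the database so the
-- universe exposes lighter, report-ready columns.
SELECT o.region,
       YEAR(o.order_date)         AS order_year,
       COUNT(DISTINCT o.order_id) AS order_count,
       SUM(o.order_amount)        AS total_amount
FROM   dw.orders o
GROUP  BY o.region, YEAR(o.order_date);
```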