Sr. Big Data Technical Architect Resume
San Francisco, CA
TECHNICAL SKILLS:
Database: MySQL, Microsoft SQL Server 2005, SQL Server 2008 R2, MS Access; NoSQL technologies (Hadoop, Cassandra, CouchBase, and MongoDB); Hadoop ecosystem (Pig, ZooKeeper, HBase, YARN, Hive, Sqoop, Impala, Storm, Flume); Advanced Analytics (Data Science): R, RHadoop
Data Modeling and Design Tools: IBM Rational Rose, MS Visio, ERwin 7.3/8/9.5, Quest TOAD
Data Warehouse / ETL: Informatica PowerCenter 7.1/8.6, SSIS, SAP BODI, ODI, Pentaho
BI Tools: SAP BO, IBM Cognos 10 BI, SSAS, SSRS
Platforms: Microsoft Windows 2003/2008, Mac OS X, and Linux
Web Server: Apache HTTP Server 2.2.17, MS IIS 7.5
Requirements Management Tools: IBM Rational RequisitePro 7.1.2, Rational DOORS 9.0
Reporting and Analytics: Dashboards, strategy maps, balanced scorecards, QlikView, data mining, predictive analytics, heat maps
Testing Tools: HP QuickTest Pro 8.2-11.0, HP LoadRunner 8.0-11.0, HP Service Test 11.0, JMeter 2.4, Selenium RC, JUnit
Configuration Management: Microsoft VSS, CVS, IBM Rational ClearCase.
Concepts: Zachman Framework, TOGAF
Engagement Experience: Multi-million-dollar projects; client-facing techno-functional consultant for Fortune 500 companies
Query Tools: Apache Drill, SSMS, TOAD
Methodologies: Scrum, Agile, RUP, Waterfall
Microsoft Tools: MS Project, MS Office, MS SharePoint, MS FrontPage, MS Visio
PROFESSIONAL EXPERIENCE:
Sr. Big Data Technical Architect
Confidential
Responsibilities:
- Big Data Analytics: applied big data technologies such as Hadoop with NoSQL data management and related tools including HBase, Sqoop, Pig, Hive, and R for deep analytics and experimentation with large, multi-structured data sets (see the HBase sketch after this list). Performed analysis, architectural design, prototyping, development, integration, and testing of applications using Java/J2EE technologies. Focused on optimizing the management of unstructured, non-relational data and driving insights from it, providing business value from content through improved information management, business process optimization (BPM), and advanced analytics.
- Applied semantic correlation, ontology, and text analytics techniques in R to analyze unstructured data and identify critical insights for overall business analytics across various client domains. Designed solutions for the analysis of text, streams, documents, social media, big data, speech, and video with emerging NoSQL, natural language processing, search, and text analytics technologies. Provided architectural support for multiple big data business cases and built security architecture around the organization's strategic data assets. Participated in all aspects of the software life cycle, including analysis, design, development, unit testing, production deployment, and support, creating formal written deliverables and other documentation. Defined the key business problems to be solved, formulated approaches and gathered data to solve them, developed and analyzed solutions, drew conclusions, and presented results.
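A minimal sketch of the HBase access pattern behind this kind of work, assuming the HBase 1.x Java client; the customer_events table, column family, and values are hypothetical:

```java
import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.Connection;
import org.apache.hadoop.hbase.client.ConnectionFactory;
import org.apache.hadoop.hbase.client.Get;
import org.apache.hadoop.hbase.client.Put;
import org.apache.hadoop.hbase.client.Result;
import org.apache.hadoop.hbase.client.Table;
import org.apache.hadoop.hbase.util.Bytes;

public class CustomerEventStore {
    public static void main(String[] args) throws IOException {
        // Picks up hbase-site.xml from the classpath (ZooKeeper quorum, etc.).
        Configuration conf = HBaseConfiguration.create();
        try (Connection connection = ConnectionFactory.createConnection(conf);
             Table table = connection.getTable(TableName.valueOf("customer_events"))) {

            // Write one event: row key = customer id, column family "e".
            Put put = new Put(Bytes.toBytes("cust-00042"));
            put.addColumn(Bytes.toBytes("e"), Bytes.toBytes("last_login"),
                          Bytes.toBytes("2014-06-01T09:30:00Z"));
            table.put(put);

            // Read it back.
            Get get = new Get(Bytes.toBytes("cust-00042"));
            Result result = table.get(get);
            byte[] value = result.getValue(Bytes.toBytes("e"), Bytes.toBytes("last_login"));
            System.out.println("last_login = " + Bytes.toString(value));
        }
    }
}
```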
Sr. Big Data Technical Architect
Confidential
Responsibilities:
- Conducted extensive analysis of source data that helped architect a new solution for the Large Group business. Built a roadmap, cost estimates, and an implementation plan to deploy the Member 360 data refinery. Led discovery sessions to understand existing and desired business processes, and served as a liaison between multiple departments to facilitate project scoping and requirements gathering through deep analysis and data profiling, utilizing the Hadoop stack. Wrote Java APIs for running Apache Spark jobs (a minimal Spark sketch follows this list).
- Created Sqoop ingestion models for all source databases and used Flume to ingest data from social sites such as Twitter and Facebook. Designed the solution and led developers and architects in architecting the enterprise information architecture for the Big Data Customer 360 use case. Designed the next-generation data architecture for unstructured data at BCBSM. Added Kerberos-based authentication to the NameNode and JobTracker and added delegation-token support to HDFS (sketched below).
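A minimal sketch of the Java-API Spark work referenced in the first bullet above, assuming the Spark 2.x Java RDD API; the HDFS paths and record layout are hypothetical:

```java
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaPairRDD;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;
import scala.Tuple2;

public class MemberEventCounts {
    public static void main(String[] args) {
        SparkConf conf = new SparkConf().setAppName("member-event-counts");
        JavaSparkContext sc = new JavaSparkContext(conf);

        // Read raw member events landed on HDFS (path is hypothetical).
        JavaRDD<String> lines = sc.textFile("hdfs:///data/member360/raw/events");

        // Count events per member id, assuming a pipe-delimited layout
        // whose first field is the member id.
        JavaPairRDD<String, Long> counts = lines
                .map(line -> line.split("\\|")[0])
                .mapToPair(memberId -> new Tuple2<>(memberId, 1L))
                .reduceByKey(Long::sum);

        counts.saveAsTextFile("hdfs:///data/member360/derived/event_counts");
        sc.stop();
    }
}
```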
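And a sketch of the Kerberos hardening mentioned above, using Hadoop's UserGroupInformation API to log in from a keytab and fetch HDFS delegation tokens; the principal, keytab path, and renewer are placeholders:

```java
import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.security.Credentials;
import org.apache.hadoop.security.UserGroupInformation;
import org.apache.hadoop.security.token.Token;

public class KerberizedHdfsAccess {
    public static void main(String[] args) throws IOException {
        Configuration conf = new Configuration();
        // Matches the core-site.xml setting on a Kerberized cluster.
        conf.set("hadoop.security.authentication", "kerberos");
        UserGroupInformation.setConfiguration(conf);

        // Authenticate the service principal from its keytab
        // (principal and path are placeholders).
        UserGroupInformation.loginUserFromKeytab(
                "etl-svc@EXAMPLE.COM", "/etc/security/keytabs/etl-svc.keytab");

        // Fetch HDFS delegation tokens so long-running jobs can keep
        // talking to the NameNode without repeating the Kerberos login.
        FileSystem fs = FileSystem.get(conf);
        Credentials credentials = new Credentials();
        Token<?>[] tokens = fs.addDelegationTokens("yarn", credentials);
        for (Token<?> token : tokens) {
            System.out.println("Acquired token: " + token.getKind());
        }
    }
}
```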
Sr. Big Data Technical Architect
Confidential
Responsibilities:
- CA Confidential Financial Services (HP Financial Services) has operated for over 20 years as the world's second-largest captive IT financing company, with over $12B in assets, $7B a year in new financing, $4B in revenue, and global capabilities spanning more than 1,500 employees in over 50 countries. Led large teams of enterprise architects involved in planning, developing, and implementing infrastructure improvement strategies, providing implementation expertise for a variety of big data technologies. Devised and led the implementation of a next-generation architecture for more efficient data ingestion and processing. Designed the HDFS cluster so that all users and services, such as Oozie, must authenticate, and so that tasks run with the privileges and identity of the submitting user; Kerberos served as the underlying authentication service, with all principals authenticating using their system credentials.
- Analyzed data by running Hive queries and Pig scripts to combine data sets, implemented business logic by writing Pig UDFs in Java, and handled ingestion from various data sources using Flume and Sqoop, performing transformations with Hive and MapReduce while loading data into HDFS. Trained the middle and top managers who were stakeholders in enterprise-wide projects. Managed and evaluated proof-of-concept implementations and championed platform strategies across the Architecture, Engineering, Strategy, Product Development, and Marketing organizations. Worked with QlikView developers and architected data models for their extensive reports using a combination of charts and tables.
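A minimal sketch of a Pig UDF of the kind described above, using Pig's EvalFunc API; the normalization logic itself is illustrative:

```java
import java.io.IOException;

import org.apache.pig.EvalFunc;
import org.apache.pig.data.Tuple;

/**
 * Illustrative Pig UDF: normalizes a free-text field to upper case
 * with collapsed whitespace before downstream joins.
 */
public class NormalizeText extends EvalFunc<String> {
    @Override
    public String exec(Tuple input) throws IOException {
        if (input == null || input.size() == 0 || input.get(0) == null) {
            return null;
        }
        String raw = input.get(0).toString();
        return raw.trim().replaceAll("\\s+", " ").toUpperCase();
    }
}
```

In a Pig script the compiled jar would be registered with REGISTER and the function invoked like any built-in, e.g. GENERATE NormalizeText(name) (the script fragment is likewise illustrative).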
Sr. Technical Architect
Confidential
Responsibilities:
- Interfaced with a variety of business and technical stakeholders to gather and solidify requirements for data analysis and report generation; created the queries to extract and interpret data; and developed, tested, and delivered the report solutions.
- Managed project budgets, risk analysis and mitigation, issue management, root-cause analysis, and project schedule development and tracking using MS Project. Performed enterprise, logical, and physical data modeling, along with EDW, OLTP, OLAP, and ODS design and metadata management. The Customer Support Strategic Dashboard project built a corporate dashboard to help customer support managers analyze the performance of technical support engineers against key performance metrics, both worldwide and at the regional level. Designed and developed the metric universe for the dashboard project.
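A sketch of the kind of KPI extraction query behind such a dashboard, issued over JDBC against a hypothetical support-case table; the schema, connection URL, and credential handling are invented for illustration (the DATEDIFF syntax assumes SQL Server, per the skills list):

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;

public class RegionalKpiExtract {
    public static void main(String[] args) throws SQLException {
        // Hypothetical warehouse connection; credentials would come from config.
        String url = "jdbc:sqlserver://warehouse;databaseName=support";

        // Average days-to-resolution and case volume per region:
        // one row per region, feeding the regional KPI view of the dashboard.
        String sql =
                "SELECT region, "
              + "       AVG(DATEDIFF(day, opened_date, closed_date)) AS avg_days_to_close, "
              + "       COUNT(*) AS cases_closed "
              + "FROM support_cases "
              + "WHERE closed_date >= ? "
              + "GROUP BY region "
              + "ORDER BY avg_days_to_close";

        try (Connection conn = DriverManager.getConnection(url, "user", "password");
             PreparedStatement stmt = conn.prepareStatement(sql)) {
            stmt.setDate(1, java.sql.Date.valueOf("2013-01-01"));
            try (ResultSet rs = stmt.executeQuery()) {
                while (rs.next()) {
                    System.out.printf("%s: %d days avg over %d cases%n",
                            rs.getString("region"),
                            rs.getInt("avg_days_to_close"),
                            rs.getInt("cases_closed"));
                }
            }
        }
    }
}
```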
Sr. Data Architect
Confidential - San Francisco, CA
Responsibilities:
- Led efforts for the BI infrastructure in roles ranging from data architect to Informatica ETL developer. Designed the data architecture around a three-level data model using industry best-practice content, comprehensive design, attribution, and definitions. The JP Morgan investment banking business model consisted of core entities including brokerage services, financial advice and counseling, mergers and acquisitions services, market making, bridge financing, foreign exchange services, trading intermediary services for clients, asset management, securities swaps, repurchase agreements, forwards, futures, and options. Served as system owner and developer of the corporate budget and forecasting application.
- Facilitated better sales analytics by distributing personalized OLAP cube slices nightly to 150 regional bankers, allowing users full multi-dimensional analysis from within Excel while disconnected from the corporate intranet.
- Provided strategic guidance for warehouse implementation, development practices, report distribution methods, performance improvement, and disaster recovery.
- Served as a senior business intelligence solution architect in defining, developing, and enhancing two separate custom DW/BI systems for the corporate loans and investment business units. Primary responsibilities included planning and conducting interviews with various business users to identify and capture business requirements, defining and developing the project scope and plan, and creating two separate multi-dimensional data models using the Ralph Kimball methodology.
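A schematic sketch of the kind of Kimball-style star schema these models produced, expressed as DDL issued over JDBC; all table and column names are invented for illustration:

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.SQLException;
import java.sql.Statement;

public class LoanMartSchema {
    public static void main(String[] args) throws SQLException {
        // Hypothetical warehouse connection.
        try (Connection conn = DriverManager.getConnection(
                     "jdbc:oracle:thin:@warehouse:1521:dw", "user", "password");
             Statement stmt = conn.createStatement()) {

            // Conformed date dimension, shared by both marts.
            stmt.executeUpdate(
                "CREATE TABLE dim_date ("
              + "  date_key       INTEGER PRIMARY KEY,"
              + "  calendar_date  DATE,"
              + "  fiscal_quarter VARCHAR(8))");

            // Loan dimension (illustrative attributes only).
            stmt.executeUpdate(
                "CREATE TABLE dim_loan ("
              + "  loan_key     INTEGER PRIMARY KEY,"
              + "  product_type VARCHAR(32),"
              + "  risk_rating  VARCHAR(8))");

            // Fact table at one-row-per-loan-per-day grain, with foreign
            // keys to the surrounding dimensions: a classic star.
            stmt.executeUpdate(
                "CREATE TABLE fact_loan_balance ("
              + "  date_key            INTEGER REFERENCES dim_date(date_key),"
              + "  loan_key            INTEGER REFERENCES dim_loan(loan_key),"
              + "  outstanding_balance NUMERIC(18,2),"
              + "  accrued_interest    NUMERIC(18,2))");
        }
    }
}
```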
Sr. Data Architect
Kohl's - San Francisco, CA
Responsibilities:
- The purpose of the project was to redesign the Kohls.com ecommerce site with new features and clean site navigation; the project also involved a complete revamp of site search, zoom features, and the shopping lifecycle. Created artifacts defining key architectural considerations for Kohl's enterprise business intelligence (BI) and data warehouse system, which provides data import, ETL, data analytics, and data mining services for the sales force across 40 states. Designed, developed, and implemented a decision support system (DSS) for the Personal Product Division at Kohl's Inc.
- The DSS application was designed in a two-tier client/server environment that integrated various platforms: OS/2, Windows 2003, IBM 3090, and AS/400. Tasks included the design and development of a graphical user interface (GUI) using Commander Builder (EIS); the design and development of a multi-dimensional database (MDDB) application using Essbase that stored and maintained the company's financial, marketing, and sales data; and the loading and manipulation of data into the database using ADL and the Essbase SQL Interface.
- Additional tasks included the installation, configuration, and support of packaged software, user support, documentation, and production support.
Sr. Data Architect
Confidential
Responsibilities:
- Over 700 employees use the Siebel system to manage customer and dealer accounts and relationships. The application provides a common platform for many departments, such as Collections, Transaction Processing, Lease Customer Network, and Legal, to process various business processes and customer requests.
Database Architect
Confidential
Responsibilities:
- Served as a data warehouse specialist/consultant in the enhancement and development of an enterprise data warehouse/business intelligence application for the automotive services at Nissan Corporation.
- Primary development efforts were in the reengineering of the ETL processes using the Oracle Data Integrator (formerly Sunopsis) tool, along with the analysis, design, and construction of the enterprise data warehouse environment for retail analytics (marketing, sales, financials, credit, etc.).
- The overall architecture included a staging area for the near-real-time import of source and external data; an operational data store (ODS) aimed at data reconciliation, data normalization, data cleansing, and business-rule enforcement; and a data mart structure intended for the bulk of the reporting needs, built on periodic snapshots (and aggregations) of ODS data (a load sketch follows this list).
- Additional tasks included the creation of schema objects (tables, views, indexes, packages, triggers, etc.) in an Oracle9i environment, the expansion of a single-star dimensional model into multiple star schemas, source-to-target mapping, storage capacity planning, and the support and maintenance of the Cognos EP Series 7 reporting environment.
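A sketch of the ODS-to-mart periodic-snapshot load described above, written as plain JDBC against hypothetical Oracle tables (in production this step ran through ODI); all names are illustrative:

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.SQLException;

public class OdsSnapshotLoad {
    public static void main(String[] args) throws SQLException {
        // Hypothetical Oracle connection.
        String url = "jdbc:oracle:thin:@dwhost:1521:dw";

        // Aggregate reconciled ODS sales rows into a daily snapshot fact,
        // one row per dealer per day.
        String sql =
                "INSERT INTO mart_daily_sales (snapshot_date, dealer_key, units_sold, revenue) "
              + "SELECT TRUNC(sale_ts), dealer_key, COUNT(*), SUM(amount) "
              + "FROM ods_sales "
              + "WHERE TRUNC(sale_ts) = ? "
              + "GROUP BY TRUNC(sale_ts), dealer_key";

        try (Connection conn = DriverManager.getConnection(url, "user", "password");
             PreparedStatement stmt = conn.prepareStatement(sql)) {
            stmt.setDate(1, java.sql.Date.valueOf("2008-03-31"));
            int rows = stmt.executeUpdate();
            System.out.println("Loaded " + rows + " snapshot rows");
        }
    }
}
```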