Senior Big Data Architect Resume
Plano, TX
PROFESSIONAL SUMMARY:
- IT professional with 16+ years of combined experience across Big Data, Cyber Security, and IoT.
- Big Data evangelist, outside-the-box thinker, and techno-functional data wrangler/analyst turned Big Data Architect, specializing in the Hadoop ecosystem and NoSQL technology platforms. 16+ years of cradle-to-grave experience across an IT landscape spanning IT, Retail, Telecom, Healthcare, Insurance, and Logistics, performing predictive data analysis, data science analytics, requirements elicitation, data modeling, and use case development from inception (POC) to production.
- Big Data platform strategist, creating blueprints and roadmaps from the ground up so clients can begin their Big Data journey seamlessly.
- Executed end-to-end IoT (Internet of Things) implementations successfully using a variety of IoT platforms, including AWS Lambda, Azure, ThingWorx, Ayla, and Xively.
- IoT protocols and tooling: MQTT, CoAP, HTTP, AWS Lambda functions, Raspberry Pi, Arduino, etc.
- Strong understanding of what's "under the hood" of the Big Data world, offering best practices and recommendations to clients and driving Big Data initiatives from pilot to production. Self-motivated; sells audiences on solutions through technical product presentations and demos. Possesses a balance of technical and sales aptitude, including handling customer technical questions and issues, creating product positioning, and translating business needs into new product features and functionality.
- "Get things done" attitude with a strong boots-on-the-ground mentality, steering projects into the right portfolios and creating win-win situations for client and employer.
- Knowledge of data center architectures and services incorporating applications, servers, storage, and data center switching technologies, with the ability to manage a long and complex sales cycle and to converge Big Data with IoT platform technology.
- TOGAF-certified architect with 3 years of hands-on experience in Big Data technologies such as Hadoop and its ecosystem; still getting my hands dirty, engaging top management and stakeholders on the front end while helping fellow developers debug and fix code on the back end.
- CYBER SECURITY skills: IAM, data encryption, DLP, risk and compliance management, IDS/IPS, UTM, firewall, antivirus/antimalware, SIEM, disaster recovery, DDoS mitigation, web filtering, and security services.
- Extensive experience in the software development cycle from inception to launch, with a strong passion for emerging technologies like Hadoop and the Internet of Things (IoT).
- Ability to guide the lifecycle of Hadoop and other Big Data solutions, including requirements analysis, platform selection, logical and physical architecture design, application design and development, and testing; build Big Data pipelines, seamlessly integrate legacy platforms/apps with new emerging technologies, and serve as SME for IoT platform strategy.
- A storyteller with the techniques to win sign-off from key and tough stakeholders through diplomacy; act as a liaison bridging the gap between the business and technical worlds, keep all stakeholders in sync, stay involved in every phase of the product life cycle from inception to launch, and ensure project deliverables stay within time limits and scope.
- Effective communicator with front-end stakeholders; a good listener and team player with strong technical documentation skills, an analytical bent of mind, a strong willingness to learn new systems, and good coordination skills, able to ramp up quickly and hit the ground running.
- Team-oriented, self-starting professional working with cross-functional teams in a techno-functional environment, playing multiple roles (Big Data analyst/admin/developer) with a strong understanding of timelines, budgets, and deliverables, adding value to the company.
- Proactive about entering the developer's world: willing to wrangle lines of code, get my hands dirty with hardware and software, troubleshoot to resolve technical issues, take ownership of product deliverables, and drive projects end to end.
- A good deal-closer with customer-focused negotiating and solution-finding skills, able to wear different hats in different scenarios; able to read code, critically analyze bugs and defects, and fulfill end-user and business requirements.
- Self-starting, self-motivated team leader; detail oriented, able to think outside the box, eager to learn new systems, adapt to changing corporate environments, and improve the company's bottom line. Strong in Agile and Waterfall methodologies; expert at managing the contracts backlog and prioritizing by project deadlines.
- Experienced in full-lifecycle implementations of Big Data (Hadoop) solutions: gathering requirements from clients, performing customizations, setting up configurations, and testing for performance. Able to demystify predictive analytics and turn algorithm output into actionable insights that add business value to the enterprise.
- Strong understanding of the Software Development Life Cycle (SDLC), including good knowledge of the RUP methodology; expertise in SOA; able to reproduce, investigate, and debug software issues. Knowledge of both Agile and Waterfall development methodologies. Enterprise web development experience with AccuProcess, SaaS (Ajax), and Informatica Cloud; extensive knowledge of data warehouse concepts, HP PPM, and SAP Business Objects.
- Extensive experience interacting with offshore/virtual teams and stakeholders; problem management analysis; eliciting requirements and creating Business Requirement Documents (BRD), User Requirement Specifications, Functional Requirement Documents (FRD), System Requirement Specifications (SRS), and test plans; analyzing and creating use cases, use case diagrams, process flow diagrams, BPMN diagrams, activity diagrams, and system workflows. Strong understanding of all versions of SharePoint.
- Experience applying the Rational Unified Process (RUP) methodology using modeling and requirement documentation tools such as MS Office, MS Visio, and MS Project.
- Conducting and facilitating JAD sessions and communicating concepts with key stakeholders, the development team, SMEs, system analysts, business analysts, project managers, and external vendors.
- Experience customizing portal sites on SharePoint 2007 and 2010, Rational RequisitePro, Team Foundation Server, and a variety of e-commerce applications; cloud computing and deployment of apps. Strong understanding of SOA architecture designs and concepts.
- "Cradle to grave" experience in Big Data initiatives and strategic planning from concept to launch.
- Big Data Practice / Big Data Architecture / SME for IoT-Internet of Things strategies.
- Strong understanding of IoT platforms such as Axeda, ThingWorx, and Microsoft Azure; working knowledge of IoT strategy, standards, and protocols such as MQTT and CoAP. Exposure spanning embedded device programming and device protocols through wearable technology and sensor analytics.
- Working knowledge of and demonstrated experience with IoT, integration, and cloud integration architectures.
- Data integration & warehousing, data science, e-commerce, and web analytics.
- Design blueprints and create roadmaps for the Big Data journey, incorporating cradle-to-grave understanding of the Hadoop ecosystem and reference architecture (HDFS); write MapReduce jobs and algorithms using tools such as Pig, Hive, HBase, Impala, NoSQL, Cassandra, and IBM Data Explorer, plus ZooKeeper, Sqoop, Flume, R, Pentaho, Vertica, Informatica, Talend, and Teradata Aster. Architect and create NoSQL databases.
- BIG DATA TECHNOLOGY STACK: HDFS, YARN, Hive, Pig, Spark, Storm, Cassandra, HBase, MongoDB, R (machine learning), Pentaho, Talend, Tableau, Hyperion.
- Hadoop cluster administration, configuration, monitoring, debugging, and performance tuning; ability to implement Hadoop-based solutions and offer best practices in the Big Data space.
- Hadoop ecosystem: setting up clusters and nodes, maintaining and tuning cluster nodes end to end, troubleshooting technical issues, and offering solutions.
- Develop solutions for the Internet of Things (IoT) and provide end-to-end support from concept through the production phase of the product life cycle.
- Build security firewalls, data encryption, Kerberization, and data breach investigations.
- Strong understanding of the Hadoop ecosystem and of business intelligence and ETL tools on top of Hadoop, such as Vertica, Pentaho, Sqoop, Oozie, Flume, HBase, Tableau, Teradata, DataStax, Datameer, and Mahout (machine learning); web analytics (Omniture); MPP (massively parallel processing) in Teradata; and the ERwin 8.2 data modeling software.
- Ability to write MapReduce Programs and create Business Intelligence reports from the output.
- USE CASE Modeling & Analysis, Troubleshooting software bugs and defects management
- Daily support of several Hadoop and data warehouse appliances, including monitoring capacity, throughput, health, and usage, plus clickstream analysis of web logs.
- Collaboration with vendors and users to coordinate and accomplish repairs, upgrades, patches, and other enhancements, additions, or replacements.
- Scripting to deploy monitors, checks, and other sys admin function automation.
- Production Support for any problem leading to acceptable resolution, including daytime, nighttime, and weekend support if required.
- Oversee installations and monitor and manage change to servers (overall change management for servers). Oversee implementation of security guidelines to prevent unauthorized access to servers and report any violations.
- Collaborate with System Engineering and Network Engineering; solve complex and recurring operational issues and develop corrective actions as needed.
- Interact regularly with Metrics team, developers, engineers, and the IT outsourcer to ensure the Company’s Reliability, Availability and Serviceability (RAS) metrics are sustained and improved from current level.
- Participate in the evaluation, recommendation, and selection of hardware and software solutions. Review, evaluate, design, implement, and maintain internal and external data.
- Identify data sources, constructs data decomposition diagrams, provides data flow diagrams and documents the process.
- Write code for database access, modification, and construction, including MapReduce programs, Pig/Hive scripts, SQL-H, stored procedures, etc.
- Develop and review project plans; identify and resolve issues; communicate the status of assigned projects to end users and project stakeholders.
- Experience in operational support and hands-on implementation of Hadoop-based Big Data platforms. Gather requirements, build logical models, and provide quality documentation of detailed user requirements for the design and development of systems.
- Databases: HBase, Cassandra, MongoDB, and Impala (NoSQL); SQL Server and Oracle (relational).
- Set up stakeholder meetings, JAD sessions, workshops, technical pre-sales, whiteboarding, and PowerPoint presentations to board members. Lead the team, provide thought leadership and direction, and stay accountable for project deliverables within budget and timeline throughout the project life cycle; avoid scope creep, add clarity to tasks, communicate concepts, and update status reports to front-end stakeholders. Speak both technical and non-technical languages and showcase both sides of the coin.
- Exhibit business etiquette and people skills: network with people, connect the dots, and get things done. Facilitate JAD sessions and PowerPoint presentations of new products and services to a wide spectrum of audiences, clients, business users, and stakeholders.
- Engage stakeholders and sell the concepts: vision, scope, and SRS documents; strategy sessions with stakeholders; and one-on-one sessions to seek clarity.
- Gap analysis, impact analysis, SWOT/feasibility analysis, and product marketing & sales.
- Focus on the end game, prioritize requirements, and achieve goals within timelines and budget while avoiding scope creep.
- Business process analysis & research using iRise software / Enterprise Architect.
- Functional requirement gathering and technical requirements development and documentation.
- Prototyping, wireframe and mockup screen creation, and PowerPoint presentations to any audience.
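As a plain illustration of the MapReduce work listed above, the map/shuffle/reduce word-count pattern can be sketched in pure Python. This is a didactic sketch only, not the Hadoop Java API the resume refers to; all function names and sample data here are invented.

```python
from collections import defaultdict

def map_phase(lines):
    # Emit (word, 1) pairs, as a Hadoop mapper would
    for line in lines:
        for word in line.lower().split():
            yield word, 1

def shuffle(pairs):
    # Group values by key (Hadoop performs this between map and reduce)
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    # Sum the counts for each word, as a reducer would
    return {word: sum(values) for word, values in groups.items()}

lines = ["big data big insights", "data pipelines"]
counts = reduce_phase(shuffle(map_phase(lines)))
print(counts["big"], counts["data"])  # prints: 2 2
```

On a real cluster, the shuffle is handled by the framework and the map/reduce functions run in parallel across HDFS blocks; the control flow above only mirrors the logical phases.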
TECHNICAL SKILLS:
- BIG DATA practice: Hadoop, NoSQL, Big Data architecture, IoT (Internet of Things), data integration, cloud migration/architecture, predictive analytics, POCs, pre-sales / client meetings, presentations, sandbox environments.
- Spark, Kafka, HDFS, NoSQL, Cassandra, HBase, MongoDB, Tableau, Pentaho, QlikView, Syncsort, R, Talend, Alteryx, Splunk.
PROFESSIONAL EXPERIENCE:
Confidential, Plano, TX
Senior Big Data Architect
Responsibilities:
- Designing and Deploying Big Data Applications into Cloud Environments.
- Creating architectural blueprints and documentation for potential clients; baselining the offerings to the HPE platform and portfolio enhancements.
- Providing thought leadership to the team, playing a crucial role and adding value to the company.
- Setting up proofs of concept (PoCs) of various tools, such as Tableau in an AWS environment; testing, establishing benchmarks, and then documenting them on technology platforms.
- Taking ownership of Big Data initiatives and driving projects end to end: providing thought leadership, offering solutions, and recommending best practices with pros and cons after understanding the client's business problem statements; then building a business case, showcasing ROI, making the business value proposition to the client, and keeping the client engaged throughout the life cycle of the initiative.
- Interacting with third-party technology vendors (Hortonworks, Cloudera, MapR) on a day-to-day basis to resolve technical issues and keep all workflows in sync.
- Designing and architecting end-to-end data pipelines, including data ingest, transformation, loading, and extraction for various types of data sources, with integration into the enterprise environment. Designs cover non-functional aspects such as scalability, high availability, security, multi-tenancy, fault tolerance, and elasticity.
- Working closely with business users, the IT team, and the development team to translate business requirements into technical requirements and articulate the design sessions.
- Participating in and setting up one-on-one meetings with potential clients and stakeholders, helping take blueprints from conceptualization to production.
- Training co-workers and empowering them to take on new technology challenges; setting up presentations for various audiences and knowledge-sharing activities.
- Achieving deliverables within budget and timelines while avoiding scope creep.
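The ingest/transform/load pipeline pattern described in this role can be sketched minimally in pure Python. The record schema (`id,amount`) and field names are invented for illustration; the real pipelines targeted the Hadoop/AWS stack rather than in-memory lists.

```python
def ingest(raw_lines):
    # Parse CSV-style source records (invented schema: id,amount)
    for line in raw_lines:
        rec_id, amount = line.strip().split(",")
        yield {"id": rec_id, "amount": float(amount)}

def transform(records):
    # Drop invalid records and derive a normalized field (amount in cents)
    for rec in records:
        if rec["amount"] >= 0:
            yield {**rec, "cents": int(rec["amount"] * 100)}

def load(records, sink):
    # Append to an in-memory sink; a production load would target HDFS/Hive
    for rec in records:
        sink.append(rec)
    return sink

sink = load(transform(ingest(["a,1.50", "b,-2.00", "c,0.25"])), [])
print(len(sink))  # prints: 2 (the negative-amount record is dropped)
```

Each stage is a generator, so records stream through without materializing intermediate lists, which loosely mirrors how a distributed pipeline stages its transformations.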
Confidential
Security Architect (Big Data/ IoT)
Responsibilities:
- Cyber Security and Big data applications Security and Deployment.
- Big Data Architect & IoT Solutions leveraging Hadoop stack of Technologies.
- Setting up Perimeter Security and Firewall configurations and Penetration testing and PAM.
- Integration & Security Practices / Predictive Analytics using SPLUNK PLATFORM.
- Responsible for designing architecture solutions specific to the onboarding of a set of partner applications/services within the IoT product portfolio on manufacturing floors at global scale.
- Participate in Presales activities and internal initiatives as required such as preparing technical collateral for partners’/customer’s proposals and company presentations.
- Creates high level application design documentation, as needed, to provide guidance to IoT Platform architects and developers for the creation of system requirements.
- Manage partner architecture solutions post-implementation by working with partners and ThingWorx platform teams to align partner feature capability needs, creating and maintaining partner roadmaps; ensure solutions are scalable and meet overall business requirements.
- Cloud software architecture, communication protocols, embedded systems, and low power/restricted environment systems, IoT industry bodies Thingworx IoT Products/Services, 3rd Party IoT Services/Applications, Application Development and IoT Devices support.
- Data ingestion of Sensor data into Cloud database, big data analytics, NoSQL Databases-Cassandra
- SPARK Streaming, KAFKA and TABLEAU for live Real-time streaming and visualizations and Real-time actions to trigger notifications and system alerts to responsible personnel.
- Protocol technologies: HTTP, JMS, AMQP, MQTT, Kafka, etc.
- Web service technologies: SOAP, REST, WSDL, XML, etc.
- Programming/scripting languages and formats: JavaScript, Groovy, Python, JSON, etc.
- M2M Platform Services: Connectivity Management, Application Enablement, Device Management, Gateways - Arduino, Raspberry Pi, Intel Edison, ARM.
BIG DATA TOOLS: HDFS, SPARK, SPARK STREAMING, KAFKA, CASSANDRA, TABLEAU
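The real-time streaming alert flow described above (sensor readings in, threshold checks, notifications out) can be sketched in pure Python. The device names and the 75.0 threshold are invented for illustration; a production version would run the same logic on the Spark Streaming/Kafka stack listed above.

```python
def check_readings(readings, threshold=75.0):
    """Return an alert record for each reading above the threshold.

    `readings` is an iterable of (device_id, value) tuples. The 75.0
    default is a made-up example value, not a real system parameter.
    """
    alerts = []
    for device_id, value in readings:
        if value > threshold:
            # In production this would trigger a notification/system alert
            alerts.append({"device": device_id, "value": value,
                           "action": "notify_on_call"})
    return alerts

# Hypothetical micro-batch of sensor readings
stream = [("sensor-a", 71.2), ("sensor-b", 82.5), ("sensor-a", 90.1)]
alerts = check_readings(stream)
print(len(alerts))  # prints: 2
```

In a Spark Streaming deployment, this check would run per micro-batch over a Kafka topic, with the alert sink being a notification service rather than a returned list.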
Confidential, Charlotte, NC
Big Data Manager
Responsibilities:
- PCI / fraud detection / NoSQL integration / data blending / real-time 360° customer view / personalized offerings / risk management / contact center efficiency optimization / customer segmentation / customer churn analysis / sentiment analysis (hashtags + social media) / security firewall setup.
- Create a Strategy based on the Use cases and take ownership of the project from End-to-End.
- Focused on PCI-DSS and SOX federal audit compliance documentation and mitigation techniques.
- Create a Data model and design the appropriate Technology Stack of Big data tools and the stress testing and validate the results and establish Benchmarks and showcase the ROI.
- Used Splunk for real-time monitoring and analysis of machine data from disparate sources, triggering alerts on fraudulent patterns of user activity.
- Provided thought leadership and helped drive Big Data within the organization.
- Designs and develops data pipelines built on Apache Spark, and Hadoop that can perform at scale.
- Collaborates with other Big Data Engineers, Architects, and Data Scientists to achieve goals.
- Client facing role setting up Architecture Designs scoping and strategies / Showcasing ROI.
- Team lead for real-time data ingestion using the Big Data stack of technologies (Spark Streaming).
- Own and establish a reference architecture for the Big Data blueprint and create a roadmap for centralized operations in coordination with all verticals and cross-functional teams.
- End-to-end ownership of security/firewall layers and Kerberization of all 49 clusters.
- Prepare documentation, arrange PowerPoint presentations and whiteboarding, and train and mentor developers toward solution development and POC execution.
- Participate in Strategic discussions about Data Integration, Data Ingestion, Data ETL / ELT
- Design Real-time processing data Pipelines and trigger Proactive System notifications in Fraud detection event processing.
- Created personalized service offerings through data mining technology using R Algorithms.
Technologies: Hadoop Data Lake, Hive, Sqoop, Spark, Kafka, Cassandra, Tableau, Pentaho, Protegrity, Vormetric; integrating with social media to create actionable insights and real-time alerts, and to create personalized offers/portfolios that enhance customer loyalty.
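The fraud-pattern alerting described in this role can be illustrated with a simple rule in pure Python: flag a transaction that is far above a customer's recent average spend. The 3x factor and the amounts are invented for illustration; the actual detection ran on Splunk and the streaming stack listed above, not on this toy rule.

```python
from statistics import mean

def is_suspicious(history, amount, factor=3.0):
    """Flag a transaction exceeding `factor` times the customer's
    recent average spend. The 3.0 factor is illustrative only."""
    if not history:
        return False  # no baseline yet; a real system would use other signals
    return amount > factor * mean(history)

history = [20.0, 35.0, 25.0]           # recent transaction amounts
print(is_suspicious(history, 30.0))    # prints: False
print(is_suspicious(history, 250.0))   # prints: True
```

A production detector combines many such features (velocity, geolocation, device fingerprints) and feeds alerts into a case-management workflow rather than returning a single boolean.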
Confidential, Santa Clara, CA
Big data Solutions & IoT Architect
Responsibilities:
- Big Data POC Project Role & Responsibilities. End to End Ownership and accountability.
- Data migration; took ownership of the project from pilot/solution to production.
- Big Data POC Development on AWS-Amazon Web services/Cloudera.
- Data Ingestion / On-Premise Data Integration, Tech Support and Documentation.
- Created a "Data Lake": migrated existing data from disparate systems and sources into the Hadoop data lake using tools such as Sqoop into HDFS/Hive. Tools: Syncsort.
- Design and implement solutions to address business problems and showcase the ROI/Business value proposition and consensus with the client requirements.
- Drive Proof of Concept (POC) and Proof of Technology (POT) evaluation on interoperable technology platforms and seamlessly migrate the Legacy Apps into Big data platform.
- Train and mentor developers towards solution development and POC/POT execution
- Communicating AWS concepts, business value, and ROI to top management.
- Tech support and Offshore team interaction and taking ownership of the project and driving project deliverables from end to end.
- Vendor selection process evaluation and presentations.
- Use cases: discovery for the Internet of Things (IoT); regression analysis and predictive models based on sensor data from the production floors, leading to operational efficiency and avoiding product recalls. Integrated gateways for sensor data, ingested it into the data lake for real-time analysis, and created actionable insights using Spark Streaming MLlib algorithms.
- Used machine-generated and unstructured data to create insights and establish real-time production benchmarks; laid down a strategy for low-cost data storage.
- Real-time query responses for end users, enabling them to make decisions on the fly.
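The regression models on sensor data described above can be illustrated with a single-feature ordinary least squares fit in pure Python. The sensor readings below are invented, and the actual models used Spark MLlib over streaming data rather than this hand-rolled fit.

```python
def fit_line(xs, ys):
    """Ordinary least squares for y = a*x + b (single feature).

    A stand-in for the predictive models described above; illustrative only.
    """
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    # slope = covariance(x, y) / variance(x)
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    slope = num / den
    return slope, my - slope * mx

# Hypothetical readings: machine temperature (x) vs. measured wear (y)
xs = [1.0, 2.0, 3.0, 4.0]
ys = [2.0, 4.0, 6.0, 8.0]
a, b = fit_line(xs, ys)
print(a, b)  # prints: 2.0 0.0
```

Fitting on historical sensor data and scoring new readings against the fitted line is the basic shape of the predictive-maintenance use case (drift from the expected line signals a machine at risk).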
Confidential, Deerfield, IL
Big Data Architect
Responsibilities:
- Architect role (Hortonworks): Hadoop infrastructure setup and POC-to-production implementation; vendor selection and tech support.
- Set up a POC Hadoop cluster (45 nodes) from the ground up and established end-to-end visibility of the data flow. Provide technical direction to a team that designs and develops path-breaking, large-scale cluster data processing systems.
- Testing, fine-tuning, and diagnosing clusters; applying fixes; configuring; benchmarking; capacity planning; disaster/failure recovery automation; detection and repair of data corruption; and optimizing the cluster for better performance. Interact with the vendor (Cloudera) on any technical issues.
- Maintain the cluster with detailed information to support the sales teams; identify trends, forecast from reports, understand and highlight anomalies, and improve performance within each sales division, working comfortably with both technical and non-technical groups.
- Hadoop production support, change management, maintenance, capacity planning, compression techniques, and performance component verification; plan production cutover/deployment, recommend industry best practices, execute the project end to end from conceptual beginning to final output, and resolve technical issues encountered during the production phase.
- Gathered requirements, built logical models, provided documentation, benchmarked systems, and analyzed system bottlenecks.
Confidential, Atlanta, GA
Big Data Engineer
Responsibilities:
- Efficient truck routing and on-time delivery of shipments; creating models based on sensor data from trucks and equipment: regression analysis (IoT), remote asset management.
- Big Data Analysis & Optimization / Architecture /Machine Learning Algorithms.
- (HADOOP-Proof of Concepts /ORION Big Data Project Implementation)
- Project name: ORION (On-Road Integrated Optimization and Navigation). Created actionable insights from unstructured logistics telematics data, crunching big data on package information and user preferences to create efficient routing for drivers, leading to huge savings on the order of $50 million a year at one mile saved per day per Confidential driver.
Environment: Hadoop ecosystem, Java, .NET, Agile, MS Office, cloud computing.
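ORION's optimization is proprietary, but the shortest-path idea underlying any routing engine can be illustrated with a textbook Dijkstra sketch in Python. The graph, stop names, and mileages below are invented; this is not ORION's actual algorithm.

```python
import heapq

def shortest_path_cost(graph, start, goal):
    """Dijkstra's algorithm over a dict-of-dicts adjacency map.

    A textbook routing illustration, not the ORION optimizer itself.
    """
    dist = {start: 0.0}
    queue = [(0.0, start)]
    while queue:
        cost, node = heapq.heappop(queue)
        if node == goal:
            return cost
        if cost > dist.get(node, float("inf")):
            continue  # stale queue entry; a shorter path was already found
        for nbr, weight in graph.get(node, {}).items():
            new_cost = cost + weight
            if new_cost < dist.get(nbr, float("inf")):
                dist[nbr] = new_cost
                heapq.heappush(queue, (new_cost, nbr))
    return float("inf")

# Toy delivery graph: miles between stops (invented)
graph = {"depot": {"a": 4, "b": 1}, "b": {"a": 2, "c": 5}, "a": {"c": 1}}
print(shortest_path_cost(graph, "depot", "c"))  # prints: 4.0
```

Real fleet routing adds time windows, vehicle capacities, and traffic data on top of this core shortest-path machinery, which is what makes systems like ORION far harder than the sketch suggests.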
