Solutions & Big Data Architect Resume
Raleigh, NC
SUMMARY:
- IoT (Internet of Things) evangelist and techno-functional Big Data Architect specializing in IoT technology platforms and Big Data technologies, with 16+ years of cradle-to-grave IT experience across a broad spectrum of industries (logistics/supply chain, retail, manufacturing, telecom, healthcare, insurance), performing mission-critical real-time predictive data analysis and IoT solution creation.
- End-to-end IoT solution provider: data science and analytics, requirements elicitation, business modeling, and use-case development, taking Internet of Things (IoT) initiatives from inception (POC) to production. AWS Certified Cloud Architect providing solutions for cloud deployments.
- Strong understanding of IoT ecosystem partners and of multiple IoT protocols and platforms.
- Currently evangelizing IoT, converging Big Data and IoT to create end-to-end solutions.
- IoT platform strategist: IoT architecture and planning; M2M platform services (connectivity management, application enablement, and device management) offering world-leading security solutions for device protection, encryption, authentication, key management, and code signing. Strong background in sensor and actuator networks (wireless and wired), control systems, embedded systems, and cloud-based platforms for IoT applications.
- Create blueprints and roadmaps for clients' journeys into the Internet of Things space.
- “Get things done” attitude with a strong boots-on-the-ground mentality to steer projects into the right portfolios and create win-win situations for the client and the employer.
- A storyteller skilled at securing sign-off from key and tough stakeholders through diplomacy; acts as a liaison bridging the gap between the business and technical worlds, keeps all stakeholders in sync, and stays involved in every phase of the product life cycle, from inception to launch, ensuring project deliverables stay within time limits and scope.
- Knowledge of data center architectures and services incorporating applications, servers, storage, and data center switching technologies, plus the ability to manage a long and complex sales cycle. Able to leverage and converge Big Data with IoT platform technology.
- AWS Certified Cloud Architect with 3 years of hands-on experience in IoT and its ecosystem; still getting my hands dirty, engaging top management and stakeholders on the front end while helping fellow developers on the back end debug and fix code.
- Extensive experience across the software development cycle from inception to launch, and a strong passion for emerging technologies like Hadoop and the Internet of Things (IoT).
- Able to guide the life cycle of IoT solutions, including requirements analysis, platform selection, logical and physical architecture design, application design and development, testing, constructing data pipelines, and seamlessly integrating legacy platforms/apps into a new IoT platform strategy.
- Strong understanding of what’s “under the hood” of the IoT world, enabling best-practice recommendations to clients and driving IoT initiatives from pilot to production.
- Strong understanding of IoT platforms such as Axeda, ThingWorx, and Microsoft Azure. Working knowledge of IoT strategy, standards, and protocols such as MQTT and CoAP. Exposure spanning embedded device programming and device protocols through wearable technology and sensor analytics.
- Self-motivated; sells audiences on IoT solutions through technical product presentations and demos. Possesses a balance of technical and sales aptitude, including handling customer technical questions and issues, creating product positioning, and translating business needs into new product features.
- Effective communicator with front-end stakeholders; a good listener and team player with strong technical documentation skills and an analytical bent of mind, a strong willingness to learn new systems, good coordination skills, and the ability to ramp up quickly and hit the ground running.
- Team-oriented, self-starting professional working with cross-functional teams in a techno-functional environment, playing multiple roles (Big Data analyst, admin, developer) with a strong understanding of timelines, budgets, and deliverables, adding value to the company.
- Proactive about entering the developer's world: willing to wrangle lines of code, get hands-on with hardware and software, troubleshoot to resolve technical issues, take ownership of product deliverables, and drive projects end to end.
- A good deal-closer with customer-focused negotiating and solution-finding skills, wearing different hats in different scenarios. Able to read code, critically analyze bugs and defects, and deliver on end users' and business requirements.
- Self-starting, self-motivated team leader; detail-oriented and analytical.
- Experienced in full-life-cycle implementations of IoT solutions: gathering requirements from clients, performing customizations, setting up configurations, and testing for performance. Able to demystify predictive analytics and turn algorithm output into actionable insights that add business value to the enterprise.
- Extensive experience interacting with offshore/virtual teams and stakeholders; problem management analysis; eliciting requirements and creating Business Requirement Documents (BRD), User Requirement Specifications, Functional Requirement Documents (FRD), System Requirement Specifications (SRS), and test plans; analyzing and creating use cases, use case diagrams, process flow diagrams, BPMN diagrams, activity diagrams, and system workflows. Strong understanding of all versions of SharePoint.
- Conducting and facilitating JAD sessions and communicating concepts to key stakeholders, development teams, SMEs, system analysts, business analysts, project managers, and external vendors.
- Strong understanding of SOA architecture designs and concepts.
- Big Data practice / Big Data architecture / SME for IoT (Internet of Things) strategies.
- Design blueprints and create roadmaps for clients' IoT journeys, drawing on a cradle-to-grave understanding of the IoT ecosystem and architecture.
- Able to design a complex event processing engine leveraging the Big Data technology stack.
- Develop solutions for the Internet of Things (IoT) and provide end-to-end support from concept to the production phase of the product life cycle, keeping all stakeholders in sync.
- Build security firewalls; data encryption, Kerberization, and data breach investigations.
- Cybersecurity skills: IAM, encryption, DLP, risk and compliance management, IDS/IPS, UTM, firewall, antivirus/antimalware, SIEM, disaster recovery, DDoS mitigation, web filtering, and security services. Knowledge of firewalls, IPS, DLP, privacy, and security monitoring.
- Able to demonstrate products and solutions, showcase the value proposition, and help the client understand the ROI (return on investment).
- Scripting to deploy monitors, checks, and other sysadmin automation.
- Production support for any problem through to an acceptable resolution, including daytime, nighttime, and weekend support if required.
- Oversee installations and the monitoring and management of change to servers (overall change management for servers). Oversee the implementation of security guidelines to prevent unauthorized access to servers, and report any violations.
- Collaborate with systems engineering and network engineering; solve complex and recurring operational issues and develop corrective actions as needed.
- Interact regularly with the metrics team, developers, engineers, and the IT outsourcer to ensure the company's Reliability, Availability, and Serviceability (RAS) metrics are sustained and improved from current levels.
- Participate in the evaluation, recommendation, and selection of hardware and software solutions. Review, evaluate, design, implement, and maintain internal and external data.
- Identify data sources, construct data decomposition diagrams, provide data flow diagrams, and document the process.
- Deep understanding of different virtualization platforms, public cloud solutions, and private cloud implementations using commercial and open-source technologies: enterprise architecture, enterprise technology roadmaps and transformation strategies, private cloud implementation, public and private cloud integration, cloud computing GTM solutions, cloud adoption roadmaps for enterprises, ROI models, performance engineering, grid computing, infrastructure consolidation and optimization, enterprise social collaboration, and mobile application development.
- PRE-SALES: Strong business etiquette for building a business case and setting up stakeholder meetings, JAD sessions, workshops, technical pre-sales, whiteboarding, and PowerPoint presentations for client stakeholders. Lead the team, provide thought leadership and direction, and remain accountable for project deliverables within budget and timelines throughout the project life cycle; avoid scope creep, add clarity to tasks, communicate concepts, and deliver status reports to front-end stakeholders. Speak both technical and non-technical languages and show both sides of the coin.
- SETUP WORKSHOPS: Articulately present IoT concepts to senior management and train client staff.
- Exhibit business diplomacy and people skills: network with people, connect the dots, and get things done. Facilitate JAD sessions and PowerPoint presentations of new products and services to a wide spectrum of audiences, clients, business users, and stakeholders.
- Engage stakeholders and sell the concept: vision, scope, and SRS documents; strategy sessions with stakeholders; and one-on-one sessions to seek clarity.
- Gap analysis, impact analysis, SWOT analysis, feasibility analysis, product marketing, and sales.
- TRAIN CO-WORKERS: Train peers and co-workers, focus on the end game, prioritize requirements, and achieve goals within timelines and budget while avoiding scope creep.
- Functional requirement gathering, technical requirements development, and documentation.
- Prototyping, wireframes, mockup screen creation, and PowerPoint presentations to any audience.
- Business requirements and functional/non-functional documentation of technical specs, with sign-off from stakeholders. Set up workshops; train and mentor developers.
- Documentation and visualization using reporting tools like Tableau and QlikView.
- Train client staff; create storyboards, scenarios, personas, test plans, test scripts, test cases, and prototypes.
- Set up workshops for business sponsors, end users, and developers; educate all stakeholders so everyone is on the same page and understands the strategy and concepts behind the IT initiatives.
TECHNICAL SKILLS:
PROTOCOLS / PLATFORMS / TOOLS:
Platforms: ThingWorx, AWS, Microsoft Azure, Ayla, Predix, Evrythng
Protocols: MQTT, CoAP, HTTP, DDS, AMQP, BLE, Wi-Fi, ZigBee mesh networks
Tools: Spark, Cassandra, MongoDB, Pentaho, QlikView, Syncsort, R, Alteryx, machine learning algorithms, Tableau
Development Boards: Arduino, Raspberry Pi, Intel Edison, ARM
PROFESSIONAL EXPERIENCE:
Confidential - Raleigh, NC
IoT Solutions & Big Data Architect
Responsibilities:
- Responsible for designing architecture solutions specific to the onboarding of a set of partner applications/services within the IoT product portfolio on manufacturing floors at a global scale.
- Participate in pre-sales activities and internal initiatives as required, such as preparing technical collateral for partner/customer proposals and company presentations.
- Create high-level application design documentation, as needed, to guide IoT platform architects and developers in creating system requirements.
- Manage partner architecture solutions post-implementation by working with partners and ThingWorx platform teams to align partner feature-capability needs, creating and maintaining partner roadmaps. Ensure solutions are scalable and meet overall business requirements.
- Cloud software architecture, communication protocols, embedded systems, and low-power/restricted-environment systems; IoT industry bodies; ThingWorx IoT products/services; third-party IoT services/applications; application development; and IoT device support.
- Data ingestion of sensor data into cloud databases; big data analytics; NoSQL databases (Cassandra).
- Protocol technologies: HTTP, JMS, AMQP, MQTT, Kafka, etc.
- Web service technologies: SOAP, REST, WSDL, XML, etc.
- Programming/scripting languages: JavaScript, Groovy, Python, JSON, etc.
- M2M platform services: connectivity management, application enablement, device management; gateways: Arduino, Raspberry Pi, Intel Edison, ARM.
- Big Data tools: HDFS, Spark, Spark Streaming, Kafka, Cassandra, Tableau.
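To illustrate the MQTT topic semantics used on this platform, here is a minimal Python sketch of MQTT-style subscription filter matching (the `+` single-level and `#` multi-level wildcards); the function and topic names are illustrative only, not taken from any project codebase:

```python
def topic_matches(filter_: str, topic: str) -> bool:
    """Match an MQTT topic against a subscription filter.

    '+' matches exactly one topic level; '#' matches all remaining
    levels and must be the last element of the filter.
    """
    f_parts = filter_.split("/")
    t_parts = topic.split("/")
    for i, f in enumerate(f_parts):
        if f == "#":
            return True          # multi-level wildcard consumes the rest
        if i >= len(t_parts):
            return False         # topic has fewer levels than the filter
        if f != "+" and f != t_parts[i]:
            return False         # literal level mismatch
    return len(f_parts) == len(t_parts)

# Example: a gateway subscribed to "factory/+/temperature" receives
# readings from every production line's temperature sensor.
```

In a real deployment this matching is performed by the broker; the sketch only captures the wildcard rules a subscriber relies on when designing topic hierarchies.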
Confidential - Santa Clara, CA
Big Data Solution Architect
Responsibilities:
- Big Data POC project: end-to-end ownership and accountability.
- Data migration: took ownership of the project from pilot/solution to production.
- Integration and creation of data pipelines between sensors, gateways, and the cloud (AWS).
- POC development on AWS (Amazon Web Services).
- Data ingestion, on-premise data integration, tech support, and documentation.
- Created a “data lake”: migrated existing data from disparate systems and sources into a Hadoop data lake, using tools like Sqoop to land it in HDFS/Hive. Tools: Syncsort.
- Designed and implemented solutions to address business problems, showcasing the ROI/business value proposition and building consensus around client requirements.
- Drove Proof of Concept (POC) and Proof of Technology (POT) evaluations on interoperable technology platforms and seamlessly migrated legacy apps onto the Big Data platform.
- Trained and mentored developers in solution development and POC/POT execution.
- Enhanced SQL queries and made recommendations on best practices.
- Conducted POCs for different use cases and documented them on the AWS platform.
- Documented the functional and non-functional requirements needed for Big Data initiatives.
- Communicated AWS concepts, business value, and ROI to top management.
- Provided tech support, interacted with the offshore team, and took ownership of driving project deliverables end to end.
- Evaluated the third-party vendor network selection process and delivered presentations.
- Use cases: discovery of Internet of Things (IoT) opportunities; regression analysis and predictive models based on sensor data from production floors, leading to operational efficiency, avoided product recalls, and predictive maintenance. Integrated a gateway for sensor data, ingested it into the data lake for real-time analysis, and created actionable insights using Spark Streaming and MLlib algorithms to trigger alerts and push notifications in real time.
- Used machine-generated and unstructured data to create insights and establish production benchmarks in real time; laid out a strategy for low-cost data storage.
- Real-time query responses integrated with Tableau for end users (C-level managers), enabling them to visualize the data and make decisions on the fly.
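The real-time alerting pattern above can be sketched in plain Python. The production pipeline used Spark Streaming and MLlib; this simplified stand-in shows only the core idea, and the threshold value and payload field names are illustrative assumptions:

```python
from typing import Iterable, Iterator

def temperature_alerts(readings: Iterable[dict],
                       threshold: float = 90.0) -> Iterator[dict]:
    """Yield an alert for each sensor reading above the threshold.

    Each reading is a dict like {"sensor": "line1", "temp_f": 93.5};
    this schema is a stand-in for the real sensor payload.
    """
    for r in readings:
        if r["temp_f"] > threshold:
            yield {"sensor": r["sensor"], "temp_f": r["temp_f"], "action": "notify"}

# A toy micro-batch of sensor readings standing in for the live stream.
stream = [
    {"sensor": "line1", "temp_f": 72.0},
    {"sensor": "line2", "temp_f": 95.5},
    {"sensor": "line1", "temp_f": 91.2},
]
alerts = list(temperature_alerts(stream))
```

In the real system the same per-record predicate runs inside a streaming job, and the yielded alerts feed a push-notification service instead of a list.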
Confidential - Atlanta, GA
Solutions Architect (Embedded Systems)
Responsibilities:
- Acquired sensor data from all company remote assets and invoked machine learning algorithms to create alerts and push notifications to truck drivers and system maintenance personnel. Event hub processing based on sensor data.
- Project: ORION (On-Road Integrated Optimization and Navigation). Created actionable insights from real-time logistics telematics data, crunching big data on package information and user data to build efficient routing for drivers, leading to savings to the tune of $50 million a year at one mile saved per day for every UPS driver.
- Wrote Java/Python algorithms and MapReduce jobs for the MapReduce framework, then analyzed the emitted output to create statistical and graphical visualization reports per business users' needs using BI tools like Tableau and QlikView.
- Hands-on experience creating MapReduce jobs: entering the development environment, troubleshooting, and analyzing end results to create actionable insights and graphical dashboards feeding Business Intelligence reporting systems/DSS (Decision Support Systems) via third-party tools like Tableau, Pentaho, BigInsights (Confidential Data Explorer), Teradata Aster, and Splunk.
- Reduce asset loss. Know about product issues in time to find a solution and mitigate the risk.
- Save fuel costs. Optimize fleet routes by monitoring traffic conditions.
- Ensure temperature stability. Monitor the cold chain - according to the Food and Agriculture Organization of the United Nations, about one third of food perishes in transit every year.
- Manage warehouse stock. Monitor inventory to reduce out-of-stock situations.
- Gain user insight. Embedded sensors provide visibility into customer behavior and product usage.
- Create fleet efficiencies. Reduce redundancies and deadhead miles.
- Strong Understanding of the IoT Protocols, Message distribution and Predictive Analysis.
- Gathered requirements, built logical models, provided documentation, benchmarked systems, analyzed system bottlenecks, and proposed solutions to eliminate them; interacted with vendors to raise tech support tickets and resolve issues. Subdivided complex applications during the design phase; communicated concepts to back-end developers and explained the dependencies.
- Worked directly with UPS clients to map out their existing business processes and provide system-based predictive analytic solutions that increase efficiency and reduce operating costs, setting up automation in their newly planned systems and integrating with the UPS IT environment using Big Data solutions for increased productivity and customer satisfaction and reduced customer churn.
- ETL jobs: performed ETL on structured (transaction), semi-structured (user behavior), and unstructured (text) data, developing algorithms and systems before ingesting the data into HDFS using state-of-the-art open-source platforms like Talend, Pentaho, Splunk, Hive, and Pig.
- Deployed multi-node Cloudera Distribution Hadoop clusters (60 nodes, versions 1.x and 2.x) to prototype solutions using Mahout (0.7, 0.8), building predictive models from data on millions of retail EDI-856 transactions received in the UPS database. This helped reduce product recalls, meet shipment specifications for UPS's pharma clients, and enhance customer satisfaction.
- Designed BI dashboards, scorecards, charts/graphs, drill-downs, and dynamic reports to meet the needs of top management and decision makers.
- CASSANDRA: Used the DataStax distribution of Cassandra (peer-to-peer) to run a real-time operational data store for online transactional applications and a read-intensive database for large-scale sensor data, and created graphical BI dashboards from ad-hoc query output for top management.
- SPARK: Expert in using Spark for real-time processing and predictive analytics, ingesting telemetric/satellite data and triggering operational alerting and scheduled announcement systems to drivers on the road in real time.
- Clickstream analysis of web logs to create actionable and meaningful insights.
- WEB ANALYTICS: Measured and collected off-site and on-site web logs for analysis and reporting of internet data, with the aim of understanding and optimizing web usage, enhancing KPIs, and improving the customer web browsing experience.
- Cradle-to-grave understanding of the Hadoop ecosystem: HDFS/MapReduce, Java-related projects, and other Hadoop projects like Pig, Hive, NoSQL, ZooKeeper, Sqoop, Mahout, and Cassandra. Expert in the Massively Parallel Processing (MPP) architecture of Teradata.
- Hadoop ecosystem: setting up clusters/multi-node environments; maintenance, troubleshooting, and tuning of clusters. Involved in integrating Hadoop into existing technology stacks and software portfolios to achieve maximum business value.
- Able to design solutions independently from high-level architecture.
- Implemented Hadoop-based solutions, developed a governance strategy, and provided architectural recommendations on integration standards.
- Architected and designed solutions for the business to deliver business value.
- Estimated workload profiles (analytical processing, data processing, ad-hoc processing, etc.); ETL using tools like Pentaho, HP Vertica, Informatica, Hive, and Pig.
- Determined workload types, data landings, data volumes and intervals, data retention periods, transformations, and the types/number of integrations; planned compression levels.
- Strong requirements gathering through JAD sessions and user interviews to seek clarity and avoid ambiguity; prepared functional documents like BRDs, use cases, and Software Requirements Specifications (SRS); set up design sessions with back-end developers, kept all team members in sync with business expectations, and engaged all stakeholders throughout the project life cycle.
Environment: Hadoop ecosystem, Java, .NET, Agile, MS Office, cloud computing.
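The clickstream analysis mentioned above ultimately reduces to grouping page hits into per-user sessions before computing KPIs. A simplified pure-Python sketch follows; the 30-minute inactivity timeout and log-record fields are illustrative assumptions, not details of the actual UPS pipeline:

```python
from collections import defaultdict

SESSION_GAP = 30 * 60  # assumed: 30 minutes of inactivity starts a new session

def count_sessions(hits):
    """Count sessions per user from (user_id, unix_timestamp) log records."""
    by_user = defaultdict(list)
    for user, ts in hits:
        by_user[user].append(ts)
    sessions = {}
    for user, times in by_user.items():
        times.sort()
        n = 1
        for prev, cur in zip(times, times[1:]):
            if cur - prev > SESSION_GAP:
                n += 1  # gap exceeded: the next hit opens a new session
        sessions[user] = n
    return sessions

# Toy log: u1 has two visits separated by nearly an hour; u2 has one hit.
hits = [("u1", 0), ("u1", 600), ("u1", 4000), ("u2", 100)]
sessions = count_sessions(hits)
```

At production scale this grouping runs as a MapReduce job (user id as key, timestamps as values), but the per-user logic is identical.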
Confidential - Atlanta, GA
Big Data Consultant
Responsibilities:
- Hadoop cluster implementation strategy, Big Data, WMOS solutions, SOA architecture.
- Software enhancements, Business Intelligence, data migration, SharePoint.
- Big Data Consultant for Big Data downstream projects.
- Experience deploying best practices and methodologies to define Hadoop (Cloudera) infrastructure and roll out releases into production.
- Clickstream analysis of web logs for basket analysis, creating meaningful, actionable insights such as consumer buying patterns and predictive analytics to prevent customer churn and pre-empt competitors by bringing the most desired items to store shelves.
- Targeted marketing: the Hadoop framework helped increase sales volume and conversion rates, reduce stock-outs and lead times, and compete more effectively with alternative web-based e-commerce options like HD.com.
- Hands-on experience with MapReduce jobs on Hadoop-based distributed systems (e.g., MapReduce, Hive, HBase, Pig, and Flume), using Java extensively.
- Responsible for writing MapReduce programs; imported and exported data between HDFS and other RDBMSs using Sqoop/Hive. Involved in loading data from the UNIX file system into HDFS.
- Expert-level experience architecting, building, maintaining, and performance-tuning an enterprise-grade commercial Hadoop distribution (Cloudera CDH).
- Worked with large data sets; automated data extraction; built monitoring, reporting, and high-value automated clickstream analysis, offering Business Intelligence solutions.
- Built monitoring solutions for the Big Data infrastructure to understand its health.
- Developed data architecture strategies at the modeling, design, and implementation stages to address product requirements.
- Set up Hadoop clusters and HDFS/MapReduce jobs. Skilled in administering, installing, configuring, troubleshooting, securing, backing up, performance-monitoring, and fine-tuning Hadoop clusters. Experience using Sqoop, ZooKeeper, and Cloudera Manager. Good knowledge of Hadoop cluster architecture, cluster monitoring, and huge data set integration.
- Hadoop MapReduce programs enabled a better understanding of customer basket size and structure, real-time access to inventory levels, and insight into trade and promotion effectiveness, helping refine future advertising campaigns and align inventory levels by location. They further provided an updated view of order inventory, enabling real-time pricing tools which incorporate projections and actual behavior to maximize high-fixed-cost, low-variable-cost inventory.
- Output from Hadoop jobs helped adjust content for each user to attract and retain customers, thereby improving sales/usage volume and stopping customer churn.
Environment: Pentaho, Teradata Aster, Vertica, Splunk, Talend, Tableau.
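The basket analysis described above boils down to counting how often items co-occur across transactions. A toy Python sketch of that counting step follows; the item names and baskets are invented for illustration, and the real job ran as MapReduce over clickstream logs:

```python
from collections import Counter
from itertools import combinations

def pair_counts(baskets):
    """Count how often each pair of items appears together in a basket."""
    counts = Counter()
    for basket in baskets:
        # sorted + set gives each unordered pair a canonical key once per basket
        for pair in combinations(sorted(set(basket)), 2):
            counts[pair] += 1
    return counts

# Invented example baskets; frequently co-occurring pairs suggest
# items to shelve or promote together.
baskets = [
    ["drill", "screws", "tape"],
    ["drill", "screws"],
    ["tape", "paint"],
]
pc = pair_counts(baskets)
```

Full association-rule mining adds support/confidence thresholds on top of these raw counts, but the co-occurrence table is the starting point.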
Confidential - Cumberland, RI
Big Data Consultant / Hadoop Consultant
Responsibilities:
- Project: Big Data/Hadoop POC (Proof of Concept): data analytics and third-party data integration.
- Use cases: fraud detection, preventing customer churn, and Confidential clinical notes integration.
- Roles and responsibilities: started my Hadoop journey here, setting up a new POC in Amazon EMR; after a successful POC, we collaborated with Cloudera distribution technicians to set up, configure, and test a multi-node cluster for development and production.
- The data is fed into ETL and then processed using Hive to de-normalize and aggregate the disparate data sources. The customer profiles are categorized and product profiles are built using Pig. The processed data is then moved into Hive for real-time access using a REST-based API.
- Imported and exported data between HDFS and the RDBMS/Hive using Sqoop.
- Worked on cluster installation, commissioning and decommissioning of data nodes, name node recovery, capacity planning, and slots configuration. Set up a Hadoop cluster on Amazon EC2 using Whirr for the POC.
- Responsible for managing data coming from different sources. Installed and configured Hive and wrote Hive UDFs.
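The Hive denormalize-and-aggregate step described above is conceptually a join followed by a group-by. A small pure-Python sketch of the same logic follows; the table and column names (members, claims, regions) are invented for illustration and do not come from the actual project:

```python
from collections import defaultdict

def join_and_aggregate(claims, members):
    """Join claim rows to member rows on member_id, then sum claim
    amounts per member region (a group-by applied after the join)."""
    region_of = {m["member_id"]: m["region"] for m in members}
    totals = defaultdict(float)
    for c in claims:
        region = region_of.get(c["member_id"], "unknown")
        totals[region] += c["amount"]
    return dict(totals)

# Invented sample rows standing in for the disparate source tables.
members = [{"member_id": 1, "region": "NE"}, {"member_id": 2, "region": "SE"}]
claims = [
    {"member_id": 1, "amount": 120.0},
    {"member_id": 1, "amount": 30.0},
    {"member_id": 2, "amount": 50.0},
]
totals = join_and_aggregate(claims, members)
```

In Hive the same result comes from a `JOIN ... GROUP BY` over the denormalized tables; the sketch just makes the data flow explicit.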
Confidential -Minneapolis, MN
Embedded Systems Engineer
Responsibilities:
- Involved as a systems engineer performing systems integration using TIBCO tools for the newly merged company (Confidential + Medco).
- Conducted gap analysis, impact analysis, and pricing analysis, and fleshed out high-level requirements from business users and other stakeholders involved in this initiative.
Confidential - Horsham, PA
Systems Integration Analyst
Responsibilities:
- System integration using TIBCO; third-party software integrations. Created dashboards and scheduled automatic refreshes and emails.
- Responsible for all dashboards, metrics, and analytics for global operations.
- Analyzed the needs of 1000+ users and updated requirements. Identified risks and involved management in decision making.
Confidential
Business Systems Analyst
Responsibilities:
- Project: Data Integration and Migration of Legacy Apps.
- Responsible for creating and reviewing business requirements, functional specifications, project schedules, technical documentation and test plans.