BigData Hadoop Technical Architect Resume
SUMMARY:
- Design, architect, develop, and implement algorithms to build advanced analytics solutions and PoCs for a global customer base, including time series analysis, recommendation, fraud detection, smart meter data analysis, and real-time predictive analysis.
- Worked as a BigData Architect to design BigData/Hadoop compatibility with third-party vendor products and to define business use cases.
- For a major publishing client, performed BigData strategy and project-level analysis to determine the path forward for a Predictive Maintenance product and a general Data Analytics as a Service environment; played a role in technology selection, metrics, and management recommendations.
- Involved in designing and establishing an enterprise BigData hub, built on a five-container database channel, to preserve decades of historical and valuable data for the long term.
- Played a major role as Architect, Data Analyst, and single point of contact for BigData/Hadoop technology; also owned the BigData roadmap, the requirements gathering process, requirements design documents, and project plan deliverables to business users and stakeholders.
- Delivered a software evaluation matrix covering all competing vendors of enterprise BigData Hadoop solutions.
- Delivered designs for data warehousing and business intelligence needs, including dimensional modeling, effective capture of history, and techniques for loading and accessing large amounts of data in read-intensive applications.
- Designed, planned, managed, and successfully implemented the BigData Hadoop platform at the client location. Handled customer complaints and difficult situations calmly, listening closely to extract the details that matter when dealing with clients. Mentored employees who were new to the technology.
- Successfully led the team and provided the best roadmap solutions to the client; applied creative problem solving to examine issues from multiple angles when major problems arose.
- Delivered roadmaps, functional and technical approaches, test scenarios, and use cases to the client, delivering the right product and meeting client expectations.
- Led design excellence; thrive in collaborative environments and constantly search for new ideas and ways to improve efficiency.
- Kept excellent written records of client assignments and projects, and kept supervisors updated through weekly meetings.
- Led architectural design on BigData/Hadoop projects, bringing idea-driven, visionary design skills to complex science and technology projects. Published white papers supporting practice development of Hadoop architecture solutions.
- Delivered enterprise-level business use cases and requirements for the BigData Hadoop solution. Drove a high-priority BigData Hadoop proof of concept on Hortonworks and published the results.
- Researched and analyzed an approach for storing decades of archived emails in BigData Hadoop for future recovery; the client approved the design as a BigData solution.
- Delivered exceptional quality across engagement work and practice development initiatives; actively sought monthly feedback on quality from managers and engagement leadership.
- Extensively involved in building enterprise-level cluster setups for both Hortonworks and Cloudera.
- Extensively involved in planning and executing POCs to determine how BigData/Hadoop can relieve pain points and performance issues in processing, storing, analyzing, and securing sensitive data for applications struggling with the volume, velocity, variety, veracity, and value of their data.
- Involved in technical POCs to validate data integration between Tableau, SAS data, Spotfire, and Informatica and BigData Hadoop clusters, processing data in either direction.
- Experienced in distributed systems leveraging CDH 5.2.0, HDFS, and MapReduce; analyzed performance bottlenecks and recommended tuning optimizations.
- Proficient in the Apache Hadoop ecosystem: Pig, Flume, HBase, ZooKeeper, Hive, Impala, Sqoop, Solr, Spark, Kafka, and Apache Tika, with a strong understanding of HDFS architecture.
- Implemented proofs of concept on the Hadoop stack and various BigData analytics tools, including migration from databases such as Teradata, Oracle, SAS data, PostgreSQL, and MySQL to Hadoop (a sketch of such an import follows this list).
- Experienced in designing, developing, installing, configuring, and troubleshooting a variety of search applications and client configurations using Hadoop ecosystem components, following the SDLC process.
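To illustrate the database-to-Hadoop migration POCs above: a minimal, hypothetical sketch of a single-table import driven through Sqoop 1.4.x's programmatic entry point (org.apache.sqoop.Sqoop.runTool). The connection string, credentials, table name, and target directory are illustrative placeholders, not details from any actual engagement.

```java
import org.apache.sqoop.Sqoop;

// Hypothetical driver for one step of an RDBMS-to-HDFS migration POC.
public class OracleToHdfsImport {
    public static void main(String[] args) {
        String[] sqoopArgs = {
            "import",
            "--connect", "jdbc:oracle:thin:@//dbhost:1521/ORCL", // placeholder source
            "--username", "etl_user",
            "--password", "changeit",            // placeholder; prefer -P or a password file
            "--table", "CUSTOMER_TXN",           // placeholder table
            "--target-dir", "/data/raw/customer_txn", // HDFS landing directory
            "--num-mappers", "4",                // parallel import slices
            "--as-textfile"
        };
        // Sqoop.runTool parses the arguments exactly as the sqoop CLI would
        // and returns the tool's exit code (0 on success).
        int exitCode = Sqoop.runTool(sqoopArgs);
        System.exit(exitCode);
    }
}
```

This is functionally equivalent to running the same `sqoop import` command from the shell; wrapping it in a Java driver simply makes the POC repeatable from a single entry point.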
TECHNICAL SKILLS:
BigData Ecosystem: CDH 5.2.0, HDP 2.4, HDFS, Spark, HBase, ZooKeeper, Hive, Tez, Impala, Solr, Sqoop, Oozie, Kafka, NiFi, Flume, Ambari, Cloudera Manager
Programming Languages: Java (basic), SQL, PL/SQL, UNIX/Linux, shell scripts
Web Technologies: HTML, XML, JavaScript, JSON, REST APIs
Databases & Tools: Oracle, DB2, MySQL, PostgreSQL, HBase, Tableau, Alteryx
Application Server: Apache Tomcat 5.5.0
IDEs, Utilities & Web: Eclipse, HTML, CSS, JavaScript
Operating Systems: Linux/Ubuntu, Windows, UNIX
Methodologies: Agile, SDLC, Scrum, UML, OOP
Protocols: TCP/IP, HTTP, SOAP and HTTPS
PROFESSIONAL EXPERIENCE:
Confidential
BigData Hadoop Technical Architect
Responsibilities:
- As a consultant, my role and responsibilities involved establishing the BigData Hadoop platform as an extensible solution platform.
- Played a major role as Solution Architect, Data Analyst, and single point of contact for BigData/Hadoop technology; also owned the BigData roadmap, the requirements gathering process, requirements design documents, project plan deliverables to business users and stakeholders, and implementation.
- Delivered a software evaluation matrix covering all competing vendors of enterprise BigData Hadoop solutions.
- Designed, planned, and managed around the clock, successfully implementing the BigData Hadoop platform at the client location. Successfully mentored employees who were new to the technology.
- Successfully led the team and provided the best roadmap solutions to the client; applied creative problem solving to examine issues from multiple angles when major problems arose.
- Delivered roadmaps, functional and technical approaches, test scenarios, and use cases to the client, delivering the right product and meeting client expectations.
- Led design excellence; thrive in collaborative environments and constantly search for new ideas and ways to improve efficiency.
- Led architectural design on BigData/Hadoop projects, bringing idea-driven, visionary design skills to complex science and technology projects.
- Delivered enterprise-level business use cases and requirements for the BigData Hadoop solution. Drove a high-priority BigData Hadoop proof of concept on Hortonworks and published the results.
- Provided expertise, analysis, and supervision for the SAS Visio installation for the client's next phase of analytics.
- Researched and analyzed an approach for storing decades of archived emails in BigData Hadoop for future recovery; the client approved the design as a BigData solution.
- Delivered exceptional quality across engagement work and practice development initiatives; actively sought monthly feedback on quality from managers and engagement leadership.
- Took the lead in identifying issues on engagements and proposing workable solutions. Established strong working relationships with clients and consolidated a position as a key contact for clients.
- Identified the challenges and complexities of engagement work and proposed workable solutions after thorough analysis.
- Took the leadership role in challenging engagements, guiding and mentoring the team through complex analysis and the delivery of quality work products.
Environment: HDP 2.4, HDFS, HBase, Flume, Sqoop, Hive, Spark, Impala, Solr, Tez, Oracle Database, PostgreSQL, Toad, SQL Server, Windows 2012 R2, NPI smart meter data, Crystal Reports.
Confidential
BigData Hadoop Architect
Responsibilities:
- Determined feasibility requirements, compatibility with the current system, and system capabilities needed to integrate new acquisitions and new business functionality.
- Extensive experience with BigData analytics, data warehousing applications, business architecture methodologies, and process modeling in a large organization across multiple functional units.
- Demonstrated success across the full lifecycle of BigData implementations; strong business acumen with an understanding of financial business operation flows.
- Established system-of-record-based consumption patterns (RDW marts and formal reporting functions, including financial, regulatory, and others).
- POC: Hadoop cluster with ecosystem components and an HBase installation with replication, loading and unloading rich documents (PDF, DOC, TXT, Excel) into HBase. Enabled Solr features such as highlighting, faceting, suggestions, and indexing.
- Independently formulated detailed program specifications using structured data analysis and design methodology. Prepared project documentation as needed.
- Independently coded new programs and designed tables to load and test effectively for the given POCs on BigData/Hadoop, using Hive, HDFS, Impala, Hue, Solr, JSON scripts, Spark, Cloudera Manager, and Cloudera Navigator to deliver fixes and changes for complex systems.
- Developed detailed application designs and specifications for computer applications; assisted in a technical lead capacity during the POC phase of project development.
- Wrote documentation describing program modifications, logic, and corrections. Oversaw development of user manuals and operating procedures. Provided technical assistance to resolve operating issues.
- Extracted, transferred, and loaded data from different sources to build the right solutions for Hadoop projects.
- Designed and analyzed business use cases to provide the right solutions for all POCs used in Hadoop projects.
- The above responsibilities are complex, involving the theoretical and practical application of highly specialized knowledge.
- Replaced Hive's default Derby metastore with MySQL. Executed queries using Hive and developed MapReduce jobs to analyze data.
- Developed Pig Latin scripts to pull information from web server output files and load it into HDFS. Built Pig UDFs to preprocess the data for analysis (a sketch of such a UDF follows this list).
- Designed, developed, installed, configured, and troubleshot a variety of search applications and client configurations using Hadoop ecosystem components (Hive/Solr/SAS Grid).
- Developed Hive queries for the analysts. Involved in loading data from Linux and UNIX file systems into HDFS.
- Supported setting up the QA environment and updating configurations for implementing Pig scripts.
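As an illustration of the Pig UDF work above: a minimal, hypothetical sketch of a Java EvalFunc that normalizes one raw field from web server output before analysis. The class name and cleanup rules are assumptions, not the original code.

```java
import java.io.IOException;

import org.apache.pig.EvalFunc;
import org.apache.pig.data.Tuple;

// Hypothetical Pig UDF: normalizes a single raw text field before analysis.
public class NormalizeField extends EvalFunc<String> {
    @Override
    public String exec(Tuple input) throws IOException {
        // Pig hands each call's arguments in a Tuple; guard against missing input.
        if (input == null || input.size() == 0 || input.get(0) == null) {
            return null;
        }
        String raw = input.get(0).toString();
        // Trim, lower-case, and collapse runs of whitespace to a single space.
        return raw.trim().toLowerCase().replaceAll("\\s+", " ");
    }
}
```

In a Pig Latin script, the jar would be registered with `REGISTER` and the function applied inside a `FOREACH ... GENERATE` clause, e.g. `GENERATE NormalizeField(url);`.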
Environment: CDH 5.2.0, HDFS, HBase, Flume 1.4, Sqoop 1.4.3, Hive 0.13.0, Spark, Impala, Solr 1.4, Apache Tika, Python, Ubuntu (Linux), Eclipse Juno, XML, JSON, Netezza, enterprise data hub.
Confidential
BigData Analyst
Roles & Responsibilities:
- Converted the existing relational database model to the Hadoop ecosystem.
- Generated datasets and loaded them into the Hadoop ecosystem.
- Designed the technical architecture workflow.
- Optimized Hive queries with appropriate algorithms and built customer attributes using Hive.
- Integrated the Hive queries with Oozie.
- Compared Hive query outputs to the existing data model outputs.
- Ran a POC on data ingestion with different tools.
- Followed agile methodology for the entire project.
- Designed and developed MapReduce programs in Java (a sketch follows this list).
- Developed UDFs and UDAFs in Java (a sketch follows this list).
- Orchestrated hundreds of Hive queries using Oozie workflows.
- Analyzed customer patterns based on the attributes.
- Conducted daily scrum calls. Prepared technical design documents and detailed design documents.
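To illustrate the Java MapReduce work above: a minimal, hypothetical job that counts records per customer attribute, written against the org.apache.hadoop.mapreduce API available in the Hadoop 0.20/MR1 era. The input layout (attribute in the first tab-separated field) and all names are assumptions, not the original programs.

```java
import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

// Hypothetical job: counts records per customer attribute (first TSV field).
public class AttributeCount {

    public static class AttrMapper extends Mapper<LongWritable, Text, Text, LongWritable> {
        private static final LongWritable ONE = new LongWritable(1);
        private final Text attr = new Text();

        @Override
        protected void map(LongWritable key, Text value, Context ctx)
                throws IOException, InterruptedException {
            String[] fields = value.toString().split("\t");
            if (fields.length > 0 && !fields[0].isEmpty()) {
                attr.set(fields[0]);
                ctx.write(attr, ONE); // emit (attribute, 1) per record
            }
        }
    }

    public static class SumReducer extends Reducer<Text, LongWritable, Text, LongWritable> {
        @Override
        protected void reduce(Text key, Iterable<LongWritable> values, Context ctx)
                throws IOException, InterruptedException {
            long sum = 0;
            for (LongWritable v : values) sum += v.get();
            ctx.write(key, new LongWritable(sum));
        }
    }

    public static void main(String[] args) throws Exception {
        // Job(conf, name) is the constructor form available on 0.20-era MR1.
        Job job = new Job(new Configuration(), "attribute-count");
        job.setJarByClass(AttributeCount.class);
        job.setMapperClass(AttrMapper.class);
        job.setCombinerClass(SumReducer.class); // safe: summing is associative
        job.setReducerClass(SumReducer.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(LongWritable.class);
        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}
```

And for the Java UDF bullet, a sketch in the old-style org.apache.hadoop.hive.ql.exec.UDF form that Hive 0.7 supported; the cleanup logic is illustrative only.

```java
import org.apache.hadoop.hive.ql.exec.UDF;
import org.apache.hadoop.io.Text;

// Hypothetical Hive UDF: trims and upper-cases an attribute value.
public class CleanAttr extends UDF {
    public Text evaluate(Text input) {
        if (input == null) return null;
        return new Text(input.toString().trim().toUpperCase());
    }
}
```

Such a UDF would be registered in Hive with `ADD JAR` and `CREATE TEMPORARY FUNCTION` before being used in queries.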
Environment: Hadoop 0.20 (MR1), HDFS, HBase, Flume 1.4, Sqoop 1.4.3, Hive 0.7.1, Java 1.6, Linux, Spring 3.2.3, Eclipse Juno, XML, JSON.
Confidential
Technical Lead
Roles & Responsibilities:
- Participated in drafting the Technical Business Design documents based on the Conceptual System Design.
- Met with the major office to manage the critical application during knowledge transition, shadowing, and reverse-shadowing periods.
- Responsible for detailed planning and for preparing the system appreciation documents for several applications and all their subsets.
- Performed business area analysis, including review of business plans, to apply performance improvement principles and rules at customer offices.
- Recognized significant problems and opportunities in clients' operations and developed an understanding of clients' systems and procedures, overall business operations, and current industry context.
- Responsible for using technical analysis and design principles to formulate detailed application plans and procedures to implement clients' requests for new or modified functionality.
- Analyzed clients' requests for new or modified applications through interviews and design sessions.
- Designed, produced, and tested proposed enhancements, verifying them through client interaction.
- Developed software programs using JCL, COBOL and DB2.
Environment: Z/OS, DB2, COBOL, JCL, CICS, VSAM, SPUFI, QMF, SQL, ACF2, XPEDITOR
Confidential
Senior Technical Analyst
Roles & Responsibilities:
- Participated in drafting the Technical Business Design documents based on the Conceptual System Design. Coded complex COBOL/DB2 modules that perform dynamic file allocation.
- Provided escalated support by investigating and resolving complicated business application issues with the associated organizations.
- Developed an expert understanding of technical and business process flow of applications and provided recommendations for improvement.
- Provided appropriate communication, facilitated bringing other parties together, and completed post-resolution reporting for high-severity incidents.
- Participated in development efforts, provided input for future requirements, and informed the Service Desk and others of release notes and any known issues.
- Worked extensively on problem investigation, analysis, and certification development for existing and new modules.
- Set up, configured, maintained, and monitored assigned business applications and related systems, and worked with mainframe transaction facility protocols.
- Worked with clients to gather business requirements and enhancements across the organization.
- Involved in production batch support: scheduling jobs, restarting, fixing abends, and bypassing cases. Tracked bugs and ensured timely, defect-free delivery.
- Coordinated with interface teams to resolve technical and business questions.
Environment: OS/390, TSO/ISPF, VSAM, COBOL, JCL, DB2, PLATINUM, CMTS, SPUFI, Toad, JIRA, SQL Explorer, MS-Office, MS Project.
Confidential
Program Analyst
Roles & Responsibilities:
- Responsible for implementing, customizing, and integrating components of the client (Travelers) application.
- Designed, produced, and tested proposed enhancements, verifying them through client interaction.
- Monitored operation and functionality throughout execution by testing applications to ensure optimal user benefit; designed and configured application modifications and enhancements as necessary.
- Planned and created web front-end applications to integrate with host-side operations.
- Implemented the integration and customization of customer-specific system packages.
- Provided first-level production support for post-go-live operations.
- Integrated and programmed middleware components.
Environment: Z/OS, DB2, COBOL, JCL, VSAM, SPUFI, QMF, SQL, ACF2, Microsoft Office (Word, Excel, Access).
Confidential
Mainframe Analyst
Roles & Responsibilities:
- Researched, updated, and maintained quality testing measures and routines.
- Assisted in the planning, creation, and control of the test environment(s).
- Identified, collected, and created test data.
- Facilitated and participated in structured walkthroughs and peer reviews.
- Took part in the coordination and implementation of system and assembly testing.
- Informed the Test Architect and Team Lead of any events that could affect the schedule, budget, or quality of the software and the testing process.
- Validated fixes, executed test scripts, and recorded problems and events in accordance with the project's problem and issue management plans.
- Handled project management, testing, and reporting of outcomes.
- Documented all testing results and maintained ABEND logs.
- Assisted and coordinated with new resources in understanding the nature of the work and the testing procedures.
- Created conceptual designs, test approaches, test plans, and test scripts, both manually and in Mercury Quality Center and CMTS. Set up and managed test environments and test data based on the test requirements. Coordinated onshore and client communication.
Environment: Z/OS, DB2, COBOL, JCL, VSAM, SPUFI, QMF, SQL.