
Big Data Architect Resume

SUMMARY:

  • Working as an Enterprise Hadoop / Big Data Architect for a Federal organization (USPTO).
  • Experienced Hadoop Architect with the ability to train and lead Big Data Hadoop teams.
  • Extensive experience and knowledge in Cloud and Hadoop architecture and design.
  • Certified Hadoop professional for Cloudera, Hortonworks, and MapR.
  • Experienced architect of Data-as-a-Service (DaaS) solutions using the Spark ecosystem.
  • Experience in implementing secured data flows using Apache NiFi and HDF.
  • Experience in end-to-end implementation of BI solutions using Tableau, Jasper Reports, QlikView, Pentaho, and Report Server.
  • Extensive experience in agile methodology and certified Scrum Master.
  • Passionate about data and focused on building next generation Big Data applications.
  • Extensive knowledge in providing business solutions using Hadoop technologies.
  • Experience in Apache Spark, the Hadoop 2.0 ecosystem, Kafka, Scala, Java, Python, and R.
  • Over 15 years of total experience in architecting, designing, and developing enterprise applications using Hadoop, Java/J2EE technologies, and NoSQL databases.
  • Hands-on experience in installing, configuring, and using Hadoop ecosystem components such as MapReduce, TEZ, YARN, HDFS, HBase, Cassandra, NiFi, Presto, Oozie, Hive, Sqoop, Pig, Flume, ELK, MongoDB, Amazon DynamoDB, LDAP, and IDM.
  • Experience in managing and troubleshooting Hadoop related issues.
  • Extensive knowledge and experience in ETL tools.
  • Experience in installation, configuration, management and deployment of Big Data solutions and the underlying infrastructure of Hadoop Cluster.
  • Knowledge in job/workflow scheduling and monitoring tools like Oozie and ZooKeeper.
  • Experience in designing, developing, and implementing connectivity products that allow efficient exchange of data between our core database engine and the Hadoop ecosystem.
  • Extensive experience in Requirements gathering, Analysis, Design, Reviews, Coding and Code Reviews, Unit and Integration Testing.
  • Experience in using application development frameworks like Hibernate, Struts, and Spring for developing integrated applications and lightweight business components.
  • Experience in developing and designing Web Services (SOAP and RESTful).
  • Good knowledge and working experience in Jasper Reports, and XML related technologies.
  • Experience in applying effective, reusable design strategies with Java/J2EE technologies.
  • Extensive experience in writing SQL queries for Oracle, Hadoop, and DB2 databases using SQL*Plus. Hands-on experience working with Oracle (9i/10g/11g), DB2, NoSQL, and MySQL, with knowledge of SQL Server.
  • Excellent technical, logical, code-debugging, and problem-solving capabilities, with the ability to anticipate market, competitor, and customer developments.
  • Proven ability to work effectively both independently and in teams with positive results. Inclined towards building a strong team/work environment, and able to adapt to new technologies and situations with ease.

TECHNICAL SKILLS:

Hadoop: Spark, HDFS, MapReduce, TEZ, Pig, Hive, HBase, Flume, Sqoop, ZooKeeper, Oozie, Storm, Kafka, ELK, GraphX, NiFi

Hadoop Cluster: Cloudera CDH4/5, Hortonworks 2.6, MapR

IDEs: Eclipse, IntelliJ IDEA, and PyCharm

NoSQL Databases: HBase, MongoDB, Cassandra, Amazon DynamoDB

Frameworks: MVC, Struts, Hibernate, and Spring

JEE Technologies: JSP, Servlets, Spring, Struts, JDBC, EJB, JMS, SOAP, RESTful Web Services, iBATIS

BI Tools: Tableau, QlikView, Jasper Reports, Pentaho, Report Server

Programming Languages: C, Java, Scala, Python, R, and Linux shell scripts

SQL Databases: Oracle 9i/10g/11g, MySQL, DB2, and MS SQL Server

Web Servers: WebSphere, WebLogic, JBoss, and Apache Tomcat

Web Technologies: HTML, CSS3, JavaScript, jQuery, and Ajax

PROFESSIONAL EXPERIENCE:

Confidential

Big Data Architect

Responsibilities:

  • Understanding the business requirements and needs and defining the road map for Big Data initiatives for Confidential's customers.
  • Responsible for building scalable distributed data solutions using Hadoop and Spark on Confidential's Cloud with the MAS platform.
  • Playing a key role in designing, developing, and implementing Confidential's home analytics (HAL) products using Hadoop and Spark.
  • Responsible for cluster maintenance on Confidential's Cloud, commissioning and decommissioning cluster nodes, cluster monitoring, and troubleshooting.
  • Implemented distributed stream processing and predictive analytics ecosystems.
  • Assisted with automated deployments using Puppet on Confidential's Cloud platform.
  • Designed and implemented the real-time analytics and ingestion platform using the Lambda architecture, based on Kafka, Sqoop, Flume, Hive, Oozie, Cassandra, HBase, Spark, Spark ML, Python, and Java.
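The Lambda architecture named in the last bullet pairs a precomputed batch view with an incremental speed-layer view and merges the two at query time. As an illustration only (the actual platform used Spark and Cassandra; the keys and counts below are hypothetical), the serving-layer merge can be sketched in a few lines of Python:

```python
from collections import Counter

def merge_views(batch_view, speed_view):
    """Serving-layer merge: add incremental counts from the streaming
    (speed) layer onto the precomputed batch-layer counts."""
    merged = Counter(batch_view)
    merged.update(speed_view)
    return dict(merged)

# Hypothetical per-key event counts
batch_view = {"page_a": 120, "page_b": 75}   # from the nightly batch job
speed_view = {"page_a": 3, "page_c": 1}      # from the Kafka/Spark stream
merged = merge_views(batch_view, speed_view)
assert merged == {"page_a": 123, "page_b": 75, "page_c": 1}
```

Merging at query time is what lets the speed layer stay small and disposable: once the nightly batch job catches up, the corresponding speed-layer counts are simply discarded.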

Confidential

Hadoop Lead

Responsibilities:

  • Evaluated the suitability of Hadoop and its ecosystem for the above project and implemented applications to adopt it, benefiting from the Big Data Hadoop initiative.
  • Extracted the needed data from different data sources into HDFS and bulk-loaded the cleaned data into HBase.
  • Wrote MapReduce programs and Hive UDFs in Java where the required functionality was too complex for plain HiveQL.
  • Involved in loading data from the Linux file system to HDFS.
  • Developed Hive jobs to analyze and categorize different items.
  • Implemented Flume to handle real-time log processing for attribution reports.
  • Performed sentiment analysis on reviews of products on the client's website.
  • Exported the resulting sentiment analysis data to Tableau for creating dashboards.
  • Used JUnit to unit test MapReduce code.
  • Maintained system integrity of all sub-components (primarily HDFS, MapReduce, HBase, and Hive).
  • Reviewed peers' Hive table creation, data loading, and queries.
  • Monitored system health and logs and responded to any warning or failure conditions.
  • Responsible for managing test data coming from various sources.
  • Involved in scheduling the Oozie workflow engine to run multiple Hive and Pig jobs.
  • Held weekly meetings with technical collaborators and actively participated in code review sessions with senior and junior developers.
  • Created and maintained technical documentation for launching Hadoop clusters and for executing Hive queries and Pig scripts.
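The categorization jobs described above follow the standard MapReduce shape: a map phase emits (category, 1) pairs, a shuffle/sort groups them by key, and a reduce phase sums each group. This is a miniature pure-Python analogue of that flow, not the actual Java job; the record format and item names are hypothetical:

```python
from itertools import groupby
from operator import itemgetter

def mapper(record):
    # Emit (category, 1) for each tab-separated "category<TAB>item" record
    category, item = record.split("\t", 1)
    yield (category, 1)

def reducer(key, values):
    # Sum the counts for one category
    return (key, sum(values))

records = ["books\tsapiens", "toys\tlego", "books\tdune"]
mapped = sorted(kv for r in records for kv in mapper(r))      # shuffle/sort phase
result = dict(reducer(k, (v for _, v in group))
              for k, group in groupby(mapped, key=itemgetter(0)))
assert result == {"books": 2, "toys": 1}
```

In the real job the framework performs the shuffle/sort across the cluster; only the mapper and reducer bodies are user code.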

Environment: Apache Hadoop, HDFS, Hive, MapReduce, Java, Flume, Cloudera, Oozie, MySQL, UNIX.

Confidential

Hadoop Developer

Responsibilities:

  • Developed Hive queries for transformations, aggregations, and mappings on the customer data.
  • Worked on importing and exporting data into HDFS and Hive using Sqoop.
  • Worked on analyzing and transforming the data with Hive and Pig.
  • Developed MapReduce programs to apply business rules to the data.
  • Developed and executed Hive queries for de-normalizing the data.
  • Automated workflows using shell scripts.
  • Performed performance tuning on Hive queries.
  • Involved in migration of data from one Hadoop cluster to another.
  • Worked on configuring multiple MapReduce pipelines for the new Hadoop cluster.
  • Performance-tuned and optimized Hadoop clusters to achieve high performance.
  • Implemented schedulers on the JobTracker to share cluster resources among the MapReduce jobs submitted by users.
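De-normalizing, as in the Hive queries above, means joining reference attributes into the fact records so downstream queries need no join. A minimal sketch of the idea in Python (the table layouts here are invented for illustration; in practice this was a HiveQL join writing into a wide table):

```python
# Hypothetical normalized inputs: a customer dimension and an order fact table
customers = {1: {"name": "Ada"}, 2: {"name": "Lin"}}
orders = [{"cust_id": 1, "total": 40},
          {"cust_id": 1, "total": 10},
          {"cust_id": 2, "total": 25}]

def denormalize(orders, customers):
    """Flatten each order with its customer's attributes, as a
    Hive join + INSERT into a wide table would."""
    return [{**order, **customers[order["cust_id"]]} for order in orders]

wide = denormalize(orders, customers)
assert wide[0] == {"cust_id": 1, "total": 40, "name": "Ada"}
```

The trade-off is classic: the wide table duplicates customer attributes across rows, but scans and aggregations over it avoid join cost, which is usually the right exchange on Hadoop-scale data.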

Confidential

Senior Java Consultant

Responsibilities:

  • Involved in architectural decisions.
  • Involved in high-level and low-level technical design.
  • Involved in coding using Java/J2EE and mentoring junior developers on complex issues.
  • Extensively used J2EE design patterns.
  • Wrote test cases for unit testing using JUnit and was involved in integration testing.
  • Used the Struts framework for application front-end development.
  • Integrated the application with the Spring framework to implement dependency injection.
  • Deployed the application on WebSphere Application Server.
  • Involved in code builds using SVN and Maven.
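The dependency injection mentioned above inverts object construction: a component declares its dependencies in its constructor, and the container (Spring, in this project) wires them in. A language-neutral sketch of the pattern in Python, with invented class names rather than the project's actual beans:

```python
class OrderRepository:
    """Stand-in for a data-access bean the container would manage."""
    def find(self, order_id):
        return {"id": order_id, "status": "shipped"}

class OrderService:
    # The dependency is injected via the constructor rather than
    # constructed internally, so tests can pass in a stub repository.
    def __init__(self, repository):
        self.repository = repository

    def status(self, order_id):
        return self.repository.find(order_id)["status"]

# In Spring this wiring lives in configuration; here we do it by hand.
service = OrderService(OrderRepository())
assert service.status(7) == "shipped"
```

The payoff is testability and loose coupling: swapping the repository for a mock requires no change to `OrderService`.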

Confidential

Senior Java Consultant

Responsibilities:

  • Interacted with business analysts to analyze the requirements.
  • Involved in design and coding using Java/J2EE technologies.
  • Wrote test cases for unit testing using JUnit and was involved in integration testing.
  • Used the Struts framework for application front-end development.
  • Used JavaScript and jQuery for client-side validation and interactive web pages.
  • Integrated the application with the Spring framework to implement dependency injection.
  • Deployed the application on the Tomcat server.
  • Involved in code builds using SVN and Maven.

Confidential

Java Technical Lead

Responsibilities:

  • Interacted with clients to collect business requirements, and analyzed and designed the system.
  • Utilized an SOA-based architecture.
  • Developed prototypes of the application in coordination with the offshore team for business approval.
  • Involved in design and coding using Java/J2EE technologies.

Confidential

Java Technical Lead

Responsibilities:

  • Gathered and analyzed requirements from the client.
  • Utilized an SOA-based architecture.
  • Coordinated design and development with the offshore team.
  • Implemented complex workflow features using Vitria BusinessWare.
  • Used Jakarta Struts as the MVC framework to design the complete Web tier.
  • Involved in end-to-end application development using J2EE and Struts, and deployment on JBoss and WebLogic application servers.

Confidential

Sr. Java Developer

Responsibilities:

  • Involved in development of different J2EE components such as EJBs, client JARs, Web modules, and application EAR modules.
  • Involved in deployment of the application on the JBoss server.
  • Used Apache's Jakarta Struts as the MVC framework to design the Web tier.
  • Developed several ActionServlet, Action, ActionForm, and JavaBean classes to implement the business logic using the Struts framework.
  • Applied the Data Access Object (DAO) pattern to abstract and encapsulate the data access mechanism, utilizing Oracle as the database for data persistence.
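The DAO pattern named above hides the persistence mechanism behind a small find/save interface so callers never touch SQL or connection handling directly. A sketch of the shape in Python, with an in-memory dict standing in for the Oracle table (class and field names are hypothetical):

```python
class UserDao:
    """Data Access Object: callers see find/save, not storage details."""
    def __init__(self):
        self._rows = {}                 # stand-in for the Oracle table

    def save(self, user_id, user):
        self._rows[user_id] = dict(user)   # copy in, like an INSERT/UPDATE

    def find(self, user_id):
        row = self._rows.get(user_id)      # like a SELECT by primary key
        return dict(row) if row else None  # copy out; None if no row

dao = UserDao()
dao.save(1, {"name": "Ada"})
assert dao.find(1) == {"name": "Ada"}
assert dao.find(2) is None
```

Because the rest of the application depends only on this interface, the Oracle-backed implementation can be swapped for an in-memory one in unit tests, which is the main reason the pattern earns its keep.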

Confidential

Java Developer

Responsibilities:

  • The base architecture was designed per the MVC pattern using the Front Controller design pattern, based on the application requirements.
  • Used Eclipse as the Java IDE supporting the development and management of Java applications. Developed business objects utilizing Session and Entity beans.
  • Involved in end-to-end coding across all layers.
  • Converted well-designed HTML pages to JSP pages by adding dynamic content to them.
  • Developed the Web interface using JSP, Servlets, HTML, and CSS.
