Technical Architect/Data Scientist Resume

SUMMARY:

  • Certified Hadoop professional on both Cloudera and Hortonworks distributions.
  • Experienced Hadoop Architect/Data Scientist with the ability to train and lead Big Data Hadoop teams.
  • Architected highly scalable, high-performance stream-based data processing and predictive analytics systems.
  • Extensive experience and knowledge in Cloud, Hadoop, and Spark architecture and design.
  • Specialized in the design and development of Big Data technologies on highly scalable, end-to-end Hadoop infrastructure.
  • Designed and implemented real-time analytics and ingestion platforms using the Lambda architecture.
  • Extensive knowledge of the AWS Cloud platform and data modeling.
  • Passionate about data and focused on building next generation Big Data applications.
  • Extensive knowledge in providing business solutions using Hadoop technologies.
  • Experience in Apache Spark, the Hadoop 2.0 ecosystem, Scala, Java, Python, and R.
  • Over 15 years of total experience architecting, designing, and developing enterprise applications using Java/J2EE technologies and the NoSQL databases HBase, MongoDB, and Cassandra.
  • Hands-on experience in installing, configuring, and using Hadoop ecosystem components such as MapReduce, YARN, HDFS, HBase, Cassandra, Oozie, Hive, Sqoop, Pig, and Flume.
  • Experience in building and supporting large-scale Hadoop environments, including design, configuration, installation, performance tuning, and monitoring.
  • Experience in importing and exporting terabytes of data using Sqoop from HDFS to Relational Database Systems and vice-versa.
  • Experience in architecting Hadoop clusters using the major Hadoop distributions.
  • Experience in managing and troubleshooting Hadoop related issues.
  • Extensive knowledge in ETL tools.
  • Experience in installation, configuration, management and deployment of Big Data solutions and the underlying infrastructure of Hadoop Cluster.
  • Knowledge of job/workflow scheduling and monitoring tools such as Oozie and ZooKeeper.
  • Experience in analyzing data using HiveQL, Pig Latin, and custom MapReduce programs in Java; extended Hive and Pig core functionality with custom User Defined Functions (see the UDF sketch after this list).
  • Worked with application teams to install operating system and Hadoop updates, patches, and version upgrades as required.
  • Hands-on experience in virtualization; worked on VMware Virtual Center.
  • Experience in designing, developing, and implementing connectivity products that allow efficient exchange of data between the core database engine and the Hadoop ecosystem.
  • Extensive experience in Requirements gathering, Analysis, Design, Reviews, Coding and Code Reviews, Unit and Integration Testing.
  • Experience in using application development frameworks such as Hibernate, Struts, and Spring for developing integrated applications and lightweight business components.
  • Experience in developing service components.
  • Experience in designing and developing Web Services (SOAP and RESTful web services).
  • Experience in developing Web Interface using Servlets, JSP and Custom Tag Libraries.
  • Good knowledge and working experience in XML related technologies.
  • Experience in applying Java and J2EE design patterns to reuse the most effective and efficient strategies.
  • Expertise in using IDEs such as WebSphere Studio (WSAD), Eclipse, NetBeans, and WebLogic Workshop.
  • Extensive experience in writing SQL queries for Oracle, Hadoop, and DB2 databases using SQL*Plus. Hands-on experience working with Oracle (9i/10g/11g), DB2, NoSQL, and MySQL, and knowledge of SQL Server.
  • Extensive experience in using SQL and PL/SQL to write Stored Procedures, Functions and Triggers.
  • Excellent technical, logical, debugging, and problem-solving capabilities, with careful attention to the evolving technology landscape and the likely moves of competitors and customers.
  • Proven ability to work effectively both independently and in teams, with positive results. Inclined toward building a strong team and work environment, and able to adapt to new technologies and situations with ease.
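
To illustrate the custom Hive UDF work noted above, here is a minimal sketch of a Hive UDF in Java; the class name and the normalization logic are hypothetical examples, not code from the engagements described.

    import org.apache.hadoop.hive.ql.exec.UDF;
    import org.apache.hadoop.io.Text;

    // Hypothetical UDF that normalizes free-text category values
    // before they are grouped and aggregated in HiveQL.
    public final class NormalizeCategory extends UDF {
        public Text evaluate(Text input) {
            if (input == null) {
                return null; // pass Hive NULLs through unchanged
            }
            String cleaned = input.toString().trim().toLowerCase();
            return new Text(cleaned);
        }
    }

On the Hive side, such a function would typically be registered with ADD JAR and CREATE TEMPORARY FUNCTION before being called from queries.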

TECHNICAL SKILLS:

Hadoop: Spark, HDFS, MapReduce, Tez, Pig, Hive, HBase, Flume, Sqoop, ZooKeeper, Oozie, Storm, Kafka, ELK, GraphX

Hadoop Cluster: Cloudera CDH4/5, Hortonworks

Deployment Frameworks: Puppet

BI Tools: Tableau

IDEs: Eclipse and NetBeans

NoSQL Databases: HBase, MongoDB, Cassandra

Frameworks: MVC, Struts, Hibernate and Spring

JEE Technologies: JSP, Servlets, Spring, Struts, JDBC, EJB, JMS, SOAP, RESTful Web Services, iBatis, Hibernate

Programming languages: C, Java, Scala, Python, R and Linux shell scripts

SQL Databases: Oracle 9i/10g/11g, MySQL

Web Servers: WebSphere, WebLogic, JBoss, and Apache Tomcat

Web Technologies: HTML, CSS3, JavaScript and jQuery

PROFESSIONAL EXPERIENCE:

Confidential

Technical Architect/Data Scientist

Responsibilities:

  • Responsible for the Confidential team spread across distinct locations in the US, UK, and India.
  • Understanding the business requirements and needs, and drawing the road map for Big Data initiatives for Confidential's customers.
  • Responsible for building scalable distributed data solutions using Hadoop and Spark on Confidential's Cloud with the MAS platform.
  • Playing a key role in designing, developing, and implementing Confidential's home analytics (HAL) products.
  • Responsible for cluster maintenance on Confidential's Cloud, commissioning and decommissioning cluster nodes, cluster monitoring, and troubleshooting.
  • Orchestrated hundreds of Hive jobs using Oozie workflows.
  • Implementation of distributed stream processing and predictive analytics ecosystems.
  • Assisted with automated deployments using Puppet on Confidential ’s Cloud platform.
  • Designed and implemented the real-time analytics and ingestion platform using a Lambda architecture based on Kafka, Sqoop, Flume, Hive, Oozie, Cassandra, HBase, Spark SQL, Spark ML, Python, and Java (see the Spark SQL sketch below).
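
A minimal sketch of the kind of Spark SQL batch layer used in such a Lambda architecture; the HDFS paths, field names, and view name are illustrative assumptions rather than the platform's actual code.

    import org.apache.spark.sql.Dataset;
    import org.apache.spark.sql.Row;
    import org.apache.spark.sql.SparkSession;

    public final class BatchLayerJob {
        public static void main(String[] args) {
            // Batch view: aggregate the raw events that the ingestion
            // layer (Kafka/Flume/Sqoop) has landed on HDFS.
            SparkSession spark = SparkSession.builder()
                    .appName("lambda-batch-layer")
                    .getOrCreate();

            Dataset<Row> events = spark.read().json("hdfs:///data/raw/events"); // hypothetical path
            events.createOrReplaceTempView("events");

            Dataset<Row> dailyCounts = spark.sql(
                    "SELECT event_type, to_date(event_ts) AS day, COUNT(*) AS cnt "
                  + "FROM events GROUP BY event_type, to_date(event_ts)");

            // Persist the batch view where the serving layer can pick it up.
            dailyCounts.write().mode("overwrite").parquet("hdfs:///data/views/daily_counts");

            spark.stop();
        }
    }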

Confidential

Hadoop Lead

Responsibilities:

  • Evaluated the suitability of Hadoop and its ecosystem for the project and implemented various proof-of-concept (POC) applications before adopting them as part of the Big Data Hadoop initiative.
  • Estimated software and hardware requirements for the NameNode and DataNodes, and planned the cluster.
  • Extracted the needed data from the server into HDFS and Bulk Loaded the cleaned data into HBase.
  • Wrote MapReduce programs and Hive UDFs in Java where the required functionality was too complex for plain queries (see the MapReduce sketch after this list).
  • Involved in loading data from the Linux file system to HDFS.
  • Developed Hive queries for the analysis, to categorize different items.
  • Designed and created Hive external tables using a shared metastore instead of Derby, with partitioning, dynamic partitioning, and buckets.
  • Delivered a POC of Flume to handle the real-time log processing for attribution reports.
  • Performed sentiment analysis on reviews of the products on the client's website.
  • Exported the resulting sentiment analysis data to Tableau for creating dashboards.
  • Used JUnit for unit testing of MapReduce programs.
  • Maintained System integrity of all sub-components (primarily HDFS, MR, HBase, and Hive).
  • Reviewing peer table creation in Hive, data loading and queries.
  • Monitored system health and logs, and responded to any warning or failure conditions.
  • Responsible for managing the test data coming from different sources.
  • Involved in scheduling the Oozie workflow engine to run multiple Hive and Pig jobs.
  • Weekly meetings with technical collaborators and active participation in code review sessions with senior and junior developers.
  • Created and maintained technical documentation for launching Hadoop clusters and for executing Hive queries and Pig scripts.
  • Involved in unit testing, interface testing, system testing, and user acceptance testing of the workflow tool.
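
A minimal sketch of the kind of Java MapReduce job referenced above, in the spirit of the item-categorization work; the input layout (category as the first tab-separated field) and all names are hypothetical.

    import java.io.IOException;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.io.IntWritable;
    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Job;
    import org.apache.hadoop.mapreduce.Mapper;
    import org.apache.hadoop.mapreduce.Reducer;
    import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
    import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

    public final class CategoryCount {

        // Emits (category, 1) for each input record; assumes the category
        // is the first tab-separated field (an illustrative layout).
        public static class CategoryMapper
                extends Mapper<LongWritable, Text, Text, IntWritable> {
            private static final IntWritable ONE = new IntWritable(1);
            private final Text category = new Text();

            @Override
            protected void map(LongWritable key, Text value, Context context)
                    throws IOException, InterruptedException {
                String[] fields = value.toString().split("\t");
                category.set(fields[0]);
                context.write(category, ONE);
            }
        }

        // Sums the counts per category.
        public static class SumReducer
                extends Reducer<Text, IntWritable, Text, IntWritable> {
            @Override
            protected void reduce(Text key, Iterable<IntWritable> values, Context context)
                    throws IOException, InterruptedException {
                int sum = 0;
                for (IntWritable v : values) {
                    sum += v.get();
                }
                context.write(key, new IntWritable(sum));
            }
        }

        public static void main(String[] args) throws Exception {
            Job job = Job.getInstance(new Configuration(), "category-count");
            job.setJarByClass(CategoryCount.class);
            job.setMapperClass(CategoryMapper.class);
            job.setCombinerClass(SumReducer.class);
            job.setReducerClass(SumReducer.class);
            job.setOutputKeyClass(Text.class);
            job.setOutputValueClass(IntWritable.class);
            FileInputFormat.addInputPath(job, new Path(args[0]));
            FileOutputFormat.setOutputPath(job, new Path(args[1]));
            System.exit(job.waitForCompletion(true) ? 0 : 1);
        }
    }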

Environment: Apache Hadoop, HDFS, Hive, MapReduce, Java, Flume, Cloudera, Oozie, MySQL, UNIX, Core Java.

Hadoop Developer

Confidential

Responsibilities:

  • Worked with the Hadoop FileSystem Java API to compute disk usage statistics (see the sketch after this list).
  • Developed Hive queries for transformations, aggregations, and mappings on the customer data.
  • Worked on importing and exporting data into HDFS and Hive using Sqoop.
  • Worked on analyzing/transforming the data with Hive and Pig.
  • Developed MapReduce programs for applying business rules to the data.
  • Developed and executed Hive queries for de-normalizing the data.
  • Automated workflow using Shell Scripts.
  • Performed performance tuning on Hive queries.
  • Involved in migrating data from one Hadoop cluster to another.
  • Worked on configuring multiple MapReduce pipelines for the new Hadoop cluster.
  • Tuned and optimized Hadoop clusters to achieve high performance.
  • Implemented schedulers on the JobTracker to share cluster resources among the MapReduce jobs submitted by users.
  • Worked on integration of HiveServer2 with Tableau.
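
A minimal sketch of computing disk usage statistics with the Hadoop FileSystem Java API, as mentioned in the first bullet above; the default path is an illustrative assumption.

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.ContentSummary;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public final class DiskUsageReport {
        public static void main(String[] args) throws Exception {
            // Picks up fs.defaultFS from the cluster configuration on the classpath.
            FileSystem fs = FileSystem.get(new Configuration());

            Path dir = new Path(args.length > 0 ? args[0] : "/user"); // hypothetical default
            ContentSummary summary = fs.getContentSummary(dir);

            // getLength() is the logical size; getSpaceConsumed() includes replication.
            System.out.printf("%s: %d bytes (raw: %d, files: %d, dirs: %d)%n",
                    dir, summary.getLength(), summary.getSpaceConsumed(),
                    summary.getFileCount(), summary.getDirectoryCount());
        }
    }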

Environment: Hadoop, MapReduce, HDFS, Hive, Java, Cloudera Hadoop distribution, Pig, HBase, Linux, XML, Java 6, Eclipse, Oracle 10g, PL/SQL.

Senior Java Consultant

Confidential

Responsibilities:

  • Interacted with Nationwide Legal business analysts and analyzed the requirements.
  • Involved in architectural decisions.
  • Interacted with team members and participated in technical meetings.
  • Involved in high-level and low-level technical design.
  • Provided estimates of development hours.
  • Attended daily stand-up meetings with team members and reported work progress.
  • Involved in coding and mentored junior developers on complex issues.
  • Extensively used J2EE design patterns.
  • Wrote test cases for unit testing using JUnit and was involved in integration testing.
  • Used the Struts framework for application front-end development.
  • Integrated the application with the Spring framework to implement dependency injection (see the sketch after this list).
  • Deployed applications on the WebSphere server.
  • Involved in code builds using SVN and Maven.
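
A minimal sketch of Spring dependency injection as referenced above, shown with Java-based configuration for brevity; the repository and service names are hypothetical, not actual project classes.

    import org.springframework.context.annotation.AnnotationConfigApplicationContext;
    import org.springframework.context.annotation.Bean;
    import org.springframework.context.annotation.Configuration;

    // Hypothetical collaborator: the service depends on an interface,
    // and Spring injects the concrete implementation.
    interface CaseRepository {
        String findCase(String id);
    }

    class CaseService {
        private final CaseRepository repository;

        CaseService(CaseRepository repository) {
            this.repository = repository; // constructor injection
        }

        String describe(String id) {
            return repository.findCase(id);
        }
    }

    @Configuration
    class AppConfig {
        @Bean
        CaseRepository caseRepository() {
            return id -> "case-" + id; // stand-in for a real JDBC-backed bean
        }

        @Bean
        CaseService caseService(CaseRepository repository) {
            return new CaseService(repository);
        }
    }

    public final class Main {
        public static void main(String[] args) {
            try (AnnotationConfigApplicationContext ctx =
                         new AnnotationConfigApplicationContext(AppConfig.class)) {
                System.out.println(ctx.getBean(CaseService.class).describe("42"));
            }
        }
    }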

Senior Java Consultant

Confidential

Responsibilities:

  • Interacted with business analysts and analyzed the requirements.
  • Interacted with team members and participated in technical meetings.
  • Involved in high-level and low-level technical design.
  • Provided estimates of development hours.
  • Attended daily stand-up meetings with team members and reported work progress.
  • Involved in coding and mentored junior developers on complex issues.
  • Extensively used J2EE design patterns.
  • Wrote test cases for unit testing using JUnit and was involved in integration testing (see the JUnit sketch after this list).
  • Used the Struts framework for application front-end development.
  • Used JavaScript and jQuery for client-side validation and interactive web pages.
  • Integrated the application with the Spring framework to implement dependency injection.
  • Deployed applications on the Tomcat server.
  • Involved in code builds using SVN and Maven.
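
A minimal sketch of the JUnit-style unit tests referenced above; the validation helper under test is a hypothetical example, not actual project code.

    import static org.junit.Assert.assertFalse;
    import static org.junit.Assert.assertTrue;

    import org.junit.Test;

    public class PolicyNumberValidatorTest {

        // Hypothetical rule: a valid policy number is exactly eight digits.
        static boolean isValidPolicyNumber(String s) {
            return s != null && s.matches("\\d{8}");
        }

        @Test
        public void acceptsEightDigitNumbers() {
            assertTrue(isValidPolicyNumber("12345678"));
        }

        @Test
        public void rejectsNullAndMalformedInput() {
            assertFalse(isValidPolicyNumber(null));
            assertFalse(isValidPolicyNumber("1234"));
            assertFalse(isValidPolicyNumber("ABCDEFGH"));
        }
    }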

Confidential

Java Technical Lead

Responsibilities:

  • Interacted with clients to collect business requirements, analyze and design the system.
  • Designed various UML Diagrams like Class diagrams and Sequence Diagrams using Rational Rose.
  • Provided an estimate of hours for development.
  • Utilized the Agile methodology for several projects.
  • Utilized the SOA architecture.
  • Developed prototypes of the application in coordination with the offshore team for business approval.
  • Developed JUnit tests and performed code builds using Maven.
  • Extensively used Struts with Tiles to build the presentation layer.
  • Utilized the Struts Validation Framework for client-side validations.
  • Extensively used J2EE Design patterns.
  • Utilized Hibernate and Spring to build persistent and reliable application modules.
  • Used iBatis as an ORM tool for object-relational mappings, and configured the hibernate.cfg.xml and *.hbm.xml mapping files for Hibernate.
  • Integrated the application with Spring framework to implement dependency injection and provide abstraction between the presentation layer and the persistence layer.
  • Implemented Web Services using XML, WSDL, and SOAP over HTTP.
  • Used JavaScript and AJAX for client-side validations and to create interactive web pages.
  • Wrote test cases for unit testing using JUnit, and was involved in integration testing.
  • Developed the application on RAD and deployed it on IBM WebSphere.
  • Created a process in Eclipse for more efficient coding for all developers.
  • Implemented a logging system for the project using Log4j (see the sketch after this list).
  • Used Subversion as the version control system on Windows.
  • Helped team members to debug issues with the application.
  • Participated actively in the application production launch.
  • Prepared the test case documents for enhancements.
  • Used JUnit for unit testing and prepared JUnit test case documents.
  • Participated in code reviews and was involved in unit, functional, peer, and integration testing.
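
A minimal sketch of the Log4j logging pattern referenced above; the class and messages are hypothetical, and appenders and levels would be configured in log4j.properties.

    import org.apache.log4j.Logger;

    public class ClaimProcessor {
        // One static logger per class is the conventional Log4j pattern.
        private static final Logger LOG = Logger.getLogger(ClaimProcessor.class);

        public void process(String claimId) {
            LOG.info("Processing claim " + claimId);
            try {
                // ... business logic ...
            } catch (RuntimeException e) {
                // Log the full stack trace before propagating.
                LOG.error("Failed to process claim " + claimId, e);
                throw e;
            }
        }
    }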

Environment: JDK 1.5/1.4, J2EE, Servlets, Struts, Spring, Hibernate 3/3.5/4.0, HQL, Maven 3.0, JAX-WS, JAXB, XML, XSD, SoapUI, jQuery, CSS, JUnit, Oracle 9i/10g, SQL, PL/SQL, Quality Center, SSH shell, SSH Client, PuTTY, VSS, WAS, WebSphere, Visual Studio, Microsoft Visio, Microsoft Project, UML, SharePoint, Windows XP and UNIX.

Java Technical Lead

Confidential

Responsibilities:

  • Gathered and analyzed requirements from the client.
  • Utilized the SOA architecture.
  • Coordinated design and development with the offshore team.
  • Used complex workflow features of Vitria BusinessWare.
  • Used Jakarta Struts as the MVC framework to design the complete Web tier.
  • Involved in end-to-end application development using J2EE and Struts, with deployment on JBoss and WebLogic application servers.
  • Developed several ActionServlet, Action, form bean, and JavaBean classes to implement business logic using the Struts framework.
  • Developed Command objects and Business Entity objects.
  • Developed JAXB objects and consumed Web Services.
  • Used JBoss application server as JMS provider to manage sessions and queues.
  • Developed a Data Access Object (DAO) pattern to abstract and encapsulate the data access mechanism (see the sketch after this list).
  • Utilized Oracle as the database for Data persistence.
  • Wrote ANT scripts to build the web application.
  • Deployed the Java war file on the Development/Test Servers.
  • Used VSS for version control of the code and configuration files.
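
A minimal sketch of the DAO pattern referenced above, shown with a plain JDBC implementation; the table, column, and class names are hypothetical.

    import java.sql.Connection;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;
    import java.sql.SQLException;

    import javax.sql.DataSource;

    // The DAO interface hides the persistence mechanism from callers.
    interface OrderDao {
        String findStatus(long orderId) throws SQLException;
    }

    // JDBC implementation; swapping in another data store would only
    // require a new implementation of the interface.
    class JdbcOrderDao implements OrderDao {
        private final DataSource dataSource;

        JdbcOrderDao(DataSource dataSource) {
            this.dataSource = dataSource;
        }

        public String findStatus(long orderId) throws SQLException {
            String sql = "SELECT status FROM orders WHERE order_id = ?"; // hypothetical table
            try (Connection conn = dataSource.getConnection();
                 PreparedStatement ps = conn.prepareStatement(sql)) {
                ps.setLong(1, orderId);
                try (ResultSet rs = ps.executeQuery()) {
                    return rs.next() ? rs.getString("status") : null;
                }
            }
        }
    }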
