Big Data/Java Technical Consultant Resume
NYC, NY
SUMMARY:
- Software professional with around 8 years of industry experience as a Big Data/Java Technical Consultant.
- In-depth understanding of Hadoop architecture and its components, such as HDFS, JobTracker, TaskTracker, NameNode, DataNode and MapReduce.
- Expertise in writing Hadoop jobs for analyzing data using MapReduce, Hive and Pig.
- Worked on real-time, in-memory tools such as Spark and Impala, and on integration with BI tools such as Tableau.
- Experience in importing and exporting data using Sqoop between HDFS and relational database systems.
- Experienced in extending Hive and Pig core functionality by writing custom UDFs and MapReduce scripts using Java and Python.
- Experience with NoSQL databases such as HBase, Cassandra and MongoDB.
- Experienced in job workflow scheduling with Oozie and cluster coordination with ZooKeeper.
- Experienced in developing applications using Java/J2EE technologies such as Servlets, JSP, EJB and JDBC.
- Experienced in developing applications using Hibernate (object/relational mapping framework).
- Experienced in developing Web Services using JAX-RPC, SOAP and WSDL.
- Thorough knowledge of XML technologies (DOM and SAX parsers), with extensive experience in XPath, XML Schema and XMLSpy.
- Ability to learn and adapt quickly and to correctly apply new tools and technology.
- Knowledge of administrative tasks such as installing Hadoop and ecosystem components like Hive and Pig.
- Experience with developing large-scale distributed applications.
- Experience in developing solutions to analyze large data sets efficiently.
- Experience in Data Warehousing and ETL processes.
- Knowledge of star schema and snowflake modeling, fact and dimension tables, and physical and logical modeling.
- Strong database modeling, SQL, ETL and data analysis skills.
- Strong communication and analytical skills, with solid programming and problem-solving experience.
TECHNICAL SKILLS:
- Spark, Spark-Streaming, Spark-SQL, Hadoop, HDFS, MapReduce, Shark, Hive, Pig, Sqoop, Flume, Kafka, Storm, Oozie, ZooKeeper.
- HBase, Cassandra, MongoDB
- Java, J2EE, Spring, Hibernate, EJB, Web Services (JAX-RPC, JAXP, JAXM), JMS, JNDI, Servlets, JSP, Jakarta Struts.
- Teradata, MS SQL Server, Oracle, Sybase, SAS, Informatica, Datastage.
- Tableau, Datameer.
- BEA WebLogic, IBM WebSphere, Tomcat.
- UML, OOAD.
- HTML, AJAX, CSS, XHTML, XML, XSL, XSLT, WSDL, SOAP
- CVS, SVN, SharePoint, ClearCase, ClearQuest, WinCVS, JUnit, MRUnit, Ant, Maven, Log4j, FrontPage
- Eclipse, NetBeans.
- Linux, UNIX, Windows
PROFESSIONAL EXPERIENCE:
Big Data/Java Technical Consultant
Confidential, NYC, NY
Responsibilities:
- Analyzed the functional specifications based on project requirements.
- Worked with technology and business user groups for Hadoop migration strategy.
- Installed & configured multi-node Hadoop Cluster and performed troubleshooting and monitoring of Hadoop Cluster.
- Developed Map Reduce programs using Java to perform various ETL, cleaning and scrubbing tasks.
- Worked on different data formats such as Parquet, Avro, SequenceFile, MapFile and XML.
- Ingested data from various sources into HDFS/Hive tables and managed data pipelines, providing DaaS (Data as a Service) to business users and data scientists for analytics.
- Worked on real-time data processing using Spark/Storm and Kafka.
- Wrote Scala programs using Spark for analyzing data.
- Used Cassandra to store billions of records, enabling faster and more efficient querying, aggregation and reporting.
- Wrote CQL queries to retrieve data from Cassandra.
- Used Datameer for integration of Hadoop with other sources such as RDBMS (Oracle), SAS, Teradata and flat files.
- Used Sqoop to move data from DB2 and Oracle into Hadoop, extending the data retention period from 1 year to 5 years.
- Wrote Hive and Pig scripts to analyze customer satisfaction index, sales patterns, etc.
- Extended Hive and Pig core functionality by writing custom UDFs using Java.
- Orchestrated Sqoop scripts, Pig scripts and Hive queries using Oozie workflows.
- Worked on a Data Lake architecture to build a reliable, scalable analytics platform meeting batch, interactive and online analytics requirements.
- Integrated Tableau with Hadoop data source for building dashboard to provide various insights on sales of the organization.
- Worked on Spark to build BI reports in Tableau; Tableau was integrated with Spark using Spark-SQL/Shark.
- Participated in daily scrum meetings and iterative development.
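The ETL cleaning and scrubbing logic mentioned above can be sketched in plain Java; this is a minimal, hypothetical illustration of the kind of transform a MapReduce mapper applies (the field layout and validation rules are invented for the example, not taken from the project):

```java
import java.util.Arrays;
import java.util.List;
import java.util.Optional;
import java.util.stream.Collectors;

/** Minimal sketch of a record-scrubbing step, as a MapReduce mapper might apply per line. */
public class RecordScrubber {

    /** Cleans one CSV line: trims fields, normalizes case, drops malformed rows. */
    public static Optional<String> scrub(String line) {
        String[] fields = line.split(",", -1);
        if (fields.length != 3) {
            return Optional.empty();          // drop records with the wrong field count
        }
        String id = fields[0].trim();
        String name = fields[1].trim().toUpperCase();
        String amount = fields[2].trim();
        if (id.isEmpty() || !amount.matches("-?\\d+(\\.\\d+)?")) {
            return Optional.empty();          // drop rows with a missing id or non-numeric amount
        }
        return Optional.of(String.join(",", id, name, amount));
    }

    public static void main(String[] args) {
        List<String> raw = Arrays.asList("1, acme ,10.5", "2,,abc", "3, Foo Bar ,7");
        List<String> clean = raw.stream()
                .map(RecordScrubber::scrub)
                .filter(Optional::isPresent)
                .map(Optional::get)
                .collect(Collectors.toList());
        clean.forEach(System.out::println);   // emits only the valid, normalized records
    }
}
```

In an actual MapReduce job this logic would sit inside a `Mapper.map()` method, with the cleaned line written to the context instead of returned.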
Technology: Spark, Spark-Streaming, Spark-SQL, Hadoop, MapReduce, Shark, Hive, Pig, Sqoop, Storm, Kafka, HBase, Cassandra, Tableau, Datameer, Ambari, Oracle, Teradata, SAS, Java 7, Scala, Python, Log4j, JUnit, MRUnit, SVN, JIRA.
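A skeletal Oozie workflow of the kind used above to chain a Sqoop import into a Hive query might look like the following (action names, the table, and all paths/parameters are illustrative placeholders, not taken from the project):

```xml
<workflow-app name="sales-pipeline" xmlns="uri:oozie:workflow:0.4">
    <start to="sqoop-import"/>
    <action name="sqoop-import">
        <sqoop xmlns="uri:oozie:sqoop-action:0.2">
            <job-tracker>${jobTracker}</job-tracker>
            <name-node>${nameNode}</name-node>
            <command>import --connect ${jdbcUrl} --table SALES --target-dir ${stagingDir}</command>
        </sqoop>
        <ok to="hive-analyze"/>
        <error to="fail"/>
    </action>
    <action name="hive-analyze">
        <hive xmlns="uri:oozie:hive-action:0.2">
            <job-tracker>${jobTracker}</job-tracker>
            <name-node>${nameNode}</name-node>
            <script>analyze_sales.hql</script>
        </hive>
        <ok to="end"/>
        <error to="fail"/>
    </action>
    <kill name="fail">
        <message>Pipeline failed: ${wf:errorMessage(wf:lastErrorNode())}</message>
    </kill>
    <end name="end"/>
</workflow-app>
```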
Big Data/Java Technical Consultant
Confidential, O’Fallon, MO
Responsibilities:
- Analyzed the business requirements, performed gap analysis and transformed them into detailed design specifications.
- Performed code reviews and was responsible for design, code and test signoff.
- Assigned work to team members and assisted them in development, clarifying design issues and fixing defects.
- Researched and recommended tools and technologies on the Hadoop stack, considering the organization's workloads.
- Performed various POCs in data ingestion, data analysis and reporting using Hadoop, MapReduce, Hive, Pig, Sqoop, Flume and Elasticsearch.
- Installed and configured Hadoop.
- Developed multiple MapReduce jobs using Java for data cleaning and preprocessing.
- Installed and configured Pig and wrote Pig Latin scripts.
- Imported/exported data using Sqoop to load data from Teradata into HDFS/Hive on a regular basis.
- Wrote Hive queries for ad-hoc business reporting.
- Defined job flows using Oozie.
- Setup and benchmarked Hadoop clusters for internal use.
- Involved in managing and reviewing Hadoop log files.
- Analyzed, designed, developed, coordinated and deployed a web-based application.
- Enhanced and optimized the functionality of the Web UI using AJAX, XSL, XSLT, CSS, XHTML and JavaScript.
- Developed Web Services using JAX-RPC, JAXP, WSDL, SOAP and XML to support obtaining quotes and receiving quote updates, customer information, status updates and confirmations.
- Extensively used SQL queries, PL/SQL stored procedures & triggers in data retrieval and updating of information in the Oracle database using JDBC.
- Wrote, configured and maintained Hibernate configuration files and updated Hibernate mapping files for each persisted Java object.
- Wrote Hibernate Query Language (HQL) queries and tuned them for better performance.
- Used the design patterns such as Session Façade, Command, Adapter, Business Delegate, Data Access Object, Value Object and Transfer Object.
- Involved in application performance tuning and fixing bugs.
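The ad-hoc Hive reporting mentioned above typically takes the shape of aggregate queries like this one (the `orders` table and its columns are hypothetical, used only to show the pattern):

```sql
-- Hypothetical ad-hoc report: monthly order volume and revenue per region
SELECT region,
       substr(order_date, 1, 7)  AS order_month,
       COUNT(*)                  AS orders,
       SUM(order_amount)         AS revenue
FROM   orders
WHERE  order_date >= '2013-01-01'
GROUP  BY region, substr(order_date, 1, 7)
ORDER  BY order_month, region;
```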
Technology: Java, J2EE, Web Services, Hibernate, Struts, JSP, Hadoop, MapReduce, Hive, Pig, Sqoop, Flume, Elasticsearch, Cloudera Manager, JDBC, XML, WebLogic Workshop.
Big Data/Java Technical Consultant
Confidential
Responsibilities:
- Involved in designing and development using UML with Rational Rose
- Played a significant role in performance tuning and optimizing the memory consumption of the application.
- Developed various enhancements and features using Java 5.0
- Developed advanced server-side classes using networking, I/O and multithreading.
- Led the issue management team and brought significant stability to the product by reducing the bug count to single digits.
- Designed and developed complex, advanced user interfaces using Swing.
- Used SAX and DOM parsers for parsing XML files.
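The multithreaded server-side work described above can be illustrated with a minimal, generic worker-pool sketch using only the standard library (the task itself is a placeholder, not the actual application logic):

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;
import java.util.stream.Collectors;

/** Minimal worker-pool sketch: submit independent tasks and collect results in order. */
public class WorkerPoolDemo {

    public static List<Integer> process(List<Integer> inputs, int threads) throws Exception {
        ExecutorService pool = Executors.newFixedThreadPool(threads);
        try {
            // Submit one task per input; the task here is a placeholder (square the number).
            List<Future<Integer>> futures = inputs.stream()
                    .map(n -> pool.submit(() -> n * n))
                    .collect(Collectors.toList());
            List<Integer> results = new ArrayList<>();
            for (Future<Integer> f : futures) {
                results.add(f.get());             // block until each task completes
            }
            return results;
        } finally {
            pool.shutdown();
        }
    }

    public static void main(String[] args) throws Exception {
        System.out.println(process(Arrays.asList(1, 2, 3, 4, 5), 3)); // prints [1, 4, 9, 16, 25]
    }
}
```

Collecting the futures first and then calling `get()` in submission order keeps results ordered even though the tasks run concurrently.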
Technology: Java 5.0, JFC Swing, Multi-Threading, IO, Networks, XML, JBuilder, UML, CVS, WinCVS, Ant & JUnit, Win XP, Unix.
Big Data/Java Technical Consultant
Confidential
Responsibilities:
- Analyzed the business requirements, performed gap analysis and transformed them into detailed design specifications.
- Involved in design process using UML & RUP (Rational Unified Process).
- Performed code reviews and was responsible for design, code and test signoff.
- Assisted the team in development, clarifying design issues and fixing defects.
- Involved in designing test plans and test cases, and in overall unit and integration testing of the system.
- Developed the business-tier logic using Session Beans (stateful and stateless).
- Developed Web Services using JAX-RPC, JAXP, WSDL, SOAP and XML to support obtaining quotes and receiving quote updates, customer information, status updates and confirmations.
- Extensively used SQL queries, PL/SQL stored procedures & triggers in data retrieval and updating of information in the Oracle database using JDBC.
- Wrote, configured and maintained Hibernate configuration files and updated Hibernate mapping files for each persisted Java object.
- Wrote Hibernate Query Language (HQL) queries and tuned them for better performance.
- Used the design patterns such as Session Façade, Command, Adapter, Business Delegate, Data Access Object, Value Object and Transfer Object.
- Deployed the application in Weblogic and used Weblogic Workshop for development and testing.
- Involved in application performance tuning (code refactoring).
- Wrote test cases using JUnit, following test-first development.
- Wrote build files using Ant and used Maven in conjunction with Ant to manage builds.
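The Transfer Object pattern listed above can be sketched as a small immutable value carrier passed between application tiers (the class and field names are illustrative, not from the actual application):

```java
import java.util.Objects;

/** Transfer Object sketch: an immutable value carrier passed between application tiers. */
public final class QuoteTO {
    private final String quoteId;
    private final double amount;

    public QuoteTO(String quoteId, double amount) {
        this.quoteId = Objects.requireNonNull(quoteId, "quoteId");
        this.amount = amount;
    }

    public String getQuoteId() { return quoteId; }
    public double getAmount()  { return amount; }

    // Value semantics: two transfer objects with the same state compare equal.
    @Override
    public boolean equals(Object o) {
        if (this == o) return true;
        if (!(o instanceof QuoteTO)) return false;
        QuoteTO other = (QuoteTO) o;
        return quoteId.equals(other.quoteId) && Double.compare(amount, other.amount) == 0;
    }

    @Override
    public int hashCode() { return Objects.hash(quoteId, amount); }

    public static void main(String[] args) {
        QuoteTO q = new QuoteTO("Q-100", 250.0);
        System.out.println(q.getQuoteId() + " " + q.getAmount()); // prints Q-100 250.0
    }
}
```

Keeping the object immutable means it can cross tier boundaries (e.g., from a Session Façade to the web tier) without defensive copying.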
Technology: EJB, Webservices, Hibernate, Struts, JSP, JMS, JNDI, JDBC, Weblogic, SQL, PL/SQL, Oracle, Sybase, XML, XSLT, WSDL, SOAP, UML, Rational Rose, Weblogic Workshop, OptimizeIt, Ant, JUnit, ClearCase, PVCS, ClearQuest, Win XP, Linux.