Senior Consultant Resume
San Jose
SUMMARY
- 10 years of total experience in software development, executing the complete software development life cycle, processes, and standards.
- 3 years of experience in leading and mentoring teams.
- 3 years of experience in the Big Data/Hadoop technology stack.
- 6 years of experience in the Java and J2EE technology stack.
- 5 years of experience in RDBMSs including Oracle, MySQL, and PostgreSQL.
- 3 years of experience in Unix and shell scripting.
- Worked on a large-scale distributed cluster that handles 2 petabytes of data.
- Worked on building a big data lake, ingesting data from different sources such as SFTP, RDBMS, and streaming feeds.
- Worked on batch processing of data using MapReduce on HDFS.
- Worked on data preprocessing using Pig scripting.
- Worked on ingesting data into Hive tables, with both batch and interactive processing of the data in those tables.
- Expertise in working with Hive partitions, extracting the date partition from the input file or from a date field.
- Worked on Hive optimizations such as bucketing, map joins, skew joins, and predicate pushdown.
- Worked on loading fixed-length data sets using RegexSerDe in Hive.
- Worked on storing data in ORC format and reading it back from ORC in Hive.
- Experience in building ETL process with big data tech stack.
- Worked on processing data with Spark and Spark SQL.
- Worked on processing structured data using Spark SQL DataFrames.
- Worked on importing data efficiently from various databases (Teradata, Oracle, SQL Server) using Sqoop.
- Experience in performance tuning of Sqoop jobs.
- Worked on storing data in the HBase NoSQL database and retrieving data from HBase.
- Experience in processing data with the Tez execution engine.
- Worked on Apache Kafka and Storm for processing streaming data.
- Experience in Cloudera, Hortonworks and Pivotal Hadoop distributions.
- Worked on developing web applications using Java, JDBC, Servlet, JSP, Struts, Hibernate and Spring.
- Worked on deploying and managing the applications in Tomcat and JBoss servers.
- Worked on developing portlets using Vaadin Web UI and Customizing Liferay enterprise portal.
- Worked on developing Gmail contextual gadgets using the Google Gadgets API and integrating employee contact information with Google Contacts using the Google Contacts and Users APIs.
- Worked on integrating different systems using the JBoss Enterprise Service Bus (ESB).
- Worked on writing SQL queries and PL/SQL programming.
- Experience with SVN and Git version control systems for managing code versions.
- Worked on building applications using Ant and Maven.
- Worked on projects that use the Agile methodology.
- Proficient in leading the team by providing proper guidance and mentoring.
- Well versed in the onsite/offshore delivery model.
TECHNICAL SKILLS
Big Data: Hadoop, MapReduce, HDFS, Pig, Hive, Sqoop, HBase, Spark, Tez, Apache Kafka, Storm, Oozie, Cloudera, Hortonworks, Pivotal
Databases: Oracle, MySQL, Teradata, and PostgreSQL
Languages: Java, Scala, Python, XML, SQL, PL/SQL
Java and J2EE related tech stack: JDBC, Servlet, JSP, Design Patterns, Struts, Spring, Hibernate, Liferay Enterprise Portal, Vaadin Web UI, JBoss ESB, HTML, CSS, JS, AJAX
Tools and Utilities: Eclipse, Oracle JDeveloper, NetBeans, SQL Developer, Toad, and SOAP UI
Servers: Tomcat, JBoss
Operating Systems: Red Hat Enterprise Linux, CentOS, Windows
Version Control & CI: Git, SVN, Jenkins
PROFESSIONAL EXPERIENCE
Confidential, San Jose
Senior Consultant
Environment: Hadoop, Hive, Pig, Tez, Spark, Storm, Kafka, Sqoop, Java, Scala, Python, Cloudera
Responsibilities:
- Responsible for leading and mentoring the team.
- Participated in standup calls and clarified requirements with the offshore team.
- Worked on preprocessing of data sets using Pig scripting.
- Worked on ingesting data from different sources into Hive staging tables, then transforming the data into base tables.
- Worked on creating Hive partitions, extracting the date partition from the input file.
- Worked on Hive bucketing for effective sampling and on optimizing joins using map joins.
- Worked on Hive optimizations such as map joins, skew joins, and predicate pushdown.
- Worked on writing data in ORC file format and reading it back from ORC in Hive tables.
- Worked on importing data from different databases using Sqoop and tuning the Sqoop jobs.
- Worked on data processing with Spark using Scala.
- Worked on Spark SQL for processing structured datasets.
- Used Data Frames in Spark SQL.
- Worked on processing streaming data using Apache Kafka and Storm.
Confidential, San Jose
Senior Programmer Analyst
Environment: Hadoop, HDFS, Pig, Hive, Sqoop, HBase, Oozie, Hortonworks
Responsibilities:
- Modeled the solution based on the functional requirements.
- Involved from requirement analysis through delivery.
- Involved in writing batch processing jobs using Pig and Hive.
- Involved in importing data from databases using Sqoop.
- Worked on storing data into and retrieving data from HBase.
- Worked on creating Hive tables and partitions.
- Involved in integration testing, unit testing, and final delivery.
- Coordinated with the global team for clarifications.
Confidential
Environment: Java, Liferay Portal 5.2.3, Vaadin Web UI, Oracle 11g, MySQL, Tomcat 5.5.28
Responsibilities:
- Involved from requirement analysis through delivery and sustaining.
- Involved in unit testing and integration testing.
- Worked on developing Liferay portal extensions.
- Involved in customizing the authentication and authorization process.
- Integrated LDAP authentication with the portal.
- Involved in developing Liferay portlets according to business needs.
Confidential
Environment: Java, JBoss ESB, XML, Oracle 11g, PostgreSQL 8.4, Eclipse 3.4, SQL Developer, SOAP UI
Responsibilities:
- Involved from requirement analysis through delivery and sustaining.
- Involved in unit testing and integration testing.
- Involved in developing transformations between two systems using JBoss ESB.
- Involved in developing services to integrate different systems.
Confidential
Environment: Java, GWT, Oracle 11g, Tomcat 6, Eclipse 3.4, SQL Developer
Responsibilities:
- Worked on customizing the SCC forms and reports.
- Worked on developing the Vendor Consigned Inventory, Supplier Managed Inventory, Release to Forecast, and Purchase Orders modules.
- Worked on customizing the authentication and authorization process and integrating it with LDAP and SAML authentication.
- Involved in unit testing and Integration testing.
Confidential
Environment: Google Gadgets API, XML, HTML, JavaScript, Web Services, Google Contacts API, Java
Responsibilities:
- Involved from requirement analysis through delivery and sustaining.
- Worked on developing the entire functionality of the application.
- Tested and deployed the application.
Confidential
Software Engineer
Environment: Java, Servlets, JSP, JDBC, Struts 1.2, XML, MySQL, Tomcat 6.0
Responsibilities:
- Analyzed the requirements provided by the client.
- Involved in design and development.
- Involved in unit testing and integration testing.
- Responsible for the delivery phase.
Confidential
Software Developer
Environment: Java, JSP, Struts 2.x, Hibernate, Spring, Tomcat 6.0, Oracle 9i, MyEclipse 6.6
Responsibilities:
- Involved in all modules of the project.
- Involved in the model, view, and controller layers.
- Involved in developing database connectivity using ORM (Hibernate).
- Involved in writing SQL queries and calling stored procedures in the DAO layer.
- Involved in using Spring dependency injection to create loosely coupled objects.
- Developed the GUI using JSP and JavaScript, and used Struts 2 tags and Tiles.
- Involved in unit testing and integration testing.