Big Data Engineer Resume
Mountain View, CA
SUMMARY
- Bachelor of Engineering in Computers with 11 years of total experience in product development with Java/J2EE technologies, including 4 years of Big Data experience.
- Strong technical expertise in Java and J2EE (Servlets, JSP, Struts, EJB, JMX, JMS (TIBCO EMS, IBM MQ)), web services (JAX-WS and JAX-RS), JSF, JavaScript, JavaMail (POP3, SMTP, IMAP), JDBC, CSS, AJAX, Spring, Swing, Hibernate (ORM), and caching (Redis, EhCache, memcached, Oracle Coherence distributed cache).
- Very good working experience with OOAD, OOP, Apache Shiro, UML diagrams, socket programming, XML, Flex, StAX, DOM, SAX, XSLT, WS-Security, XPath, JAXP, JAXB, WSDL/SOAP, Velocity scripts, Google Collections API, PrimeFaces, JSON, Maven, Perl, Ant, EMMA, JUnit, Cobertura, Selenium, Hammurapi, SVN, CVS, Sonar, JCA, JTA, JProfiler, JNI, JSTL, JNDI, SSL, SSO (SiteMinder), Hudson, JVMTI, bytecode instrumentation (Javassist), ProGuard (Java obfuscation), PL/SQL, jQuery, UNIX (shell scripting), RSS, XHTML, HTML5, JIRA, and JAAS (JSSE, JCE).
- Hands-on experience with Java/J2EE design patterns.
- Extensive development experience in multithreading, algorithm design, and data structures.
- Expertise in Hadoop, HDFS, parallel processing, Pig scripting, and Cassandra.
- In-depth experience in software product development.
- Experience in designing, implementing, and maintaining high-performance, highly available, reliable, large-scale web-based software applications.
- Advanced technical skills in web application development, SQL, designing and developing web services, and database and application performance optimization.
- Experience with cloud hosting (Google App Engine), Google Web Toolkit (GWT), JPA with Cloud SQL, and SaaS-based products.
- Hands-on experience with ESB (Mule), SOAP web services, REST services, and AWS.
- Very good working experience with SOA, BPEL engines (Apache ODE), and CORBA.
- Efficient in building Pig, Hive, and MapReduce scripts.
- Loaded data from UNIX file systems into HDFS.
- Experience customizing open-source products.
- Created Hive tables with partitioning and bucketing for finer granularity and HiveQL optimization.
- Identified job dependencies to design Oozie workflows and managed resources with YARN.
- Created Hive tables, loaded data, and wrote Hive UDFs (a minimal UDF sketch follows this summary).
- Expertise in writing Pig scripts and UDFs.
- Loaded data from various sources into HDFS using Flume.
- Experience leading the development life cycle process and best practices.
- Experience leading the delivery of large-scale distributed systems.
- Experience in application server clustering, performance tuning and scalability testing.
- In-depth experience in highly scalable, clustered IT product development and methodologies.
- Expertise in product architecture, application development, and release management.
- Strong understanding of the Agile model (Scrum) and good experience in the Scrum Master role.
- Domain expertise in telecom, investment banking, and insurance.
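A minimal sketch of the kind of Hive UDF referenced above (the class name and normalization logic are hypothetical illustrations using Hive's classic UDF API, not details from an actual project):

import org.apache.hadoop.hive.ql.exec.UDF;
import org.apache.hadoop.io.Text;

// Hypothetical UDF: strips non-digit characters from a phone-number column.
public class NormalizePhoneUDF extends UDF {
    // Hive calls evaluate() once per row.
    public Text evaluate(Text input) {
        if (input == null) {
            return null;
        }
        return new Text(input.toString().replaceAll("[^0-9]", ""));
    }
}

Such a UDF would typically be packaged as a jar, registered with ADD JAR, and exposed through CREATE TEMPORARY FUNCTION before use in HiveQL.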
PROFESSIONAL EXPERIENCE
Confidential, Mountain View CA
Big Data Engineer
Responsibilities:
- Designed and developed highly scalable SaaS applications in the cloud.
- Performed technical analysis of business requirements.
- Worked independently with the business analyst and architecture teams to understand business and technical requirements.
- Responsible for technical architecture and design.
- Designed and developed highly scalable Spark Streaming jobs on the Hadoop platform in Java and maintained some jobs in Scala (a minimal streaming sketch follows this list).
- Created design and documentation for projects for maintenance and training purposes.
- Used asynchronous programming techniques, including multi-threaded application design and development.
- Architected stable, efficient, and scalable cloud platforms.
- Designed and developed high-performance multi-threaded agents for extracting and transferring data from client systems to the Confidential application server using persistent Netty 4 connections in Java.
- Developed architectural diagrams to illustrate architectural complexities and interactions.
- Conducted vendor analysis and proofs of concept for new technologies/solutions.
- Ensured the project successfully implemented the designed solution following architectural tenets.
- Configured and integrated Redis for both caching and messaging in a high-performance cluster.
- Wrote Lua scripts on the Redis server for filtering the cache for search and for heavy cache cleanup.
- Wrote jobs using multithreading and optimized data structures.
- Worked on JVM and application code performance tuning and memory profiling of the application.
- Optimized and performance-tuned SQL queries.
- Configured and set up HAProxy between agents/connectors to handle complex scenarios.
- Configured and set up an NGINX proxy to handle complex HTTPS traffic scenarios.
- Integrated custom-developed process files to be checked in and deployed to other platforms using Git integration with the Java code.
- Designed and integrated SAML and OAuth with the application for identity management and authorization.
- Worked on both JAX-RS (REST) and JAX-WS (SOAP) web services for customers to retrieve processed data for FusionOps integration; wrote automated tests using JUnit and integrated them with CI (a minimal JAX-RS sketch follows this list).
- Designed the application server to start up quickly and reliably, independent of its cluster, while still remaining clustered.
- Developed simple and complex MapReduce programs in Java for data analysis.
- Implemented data transfer to HDFS through Apache Knox over HTTPS (SSL) via WebHDFS with basic authentication.
- Created Hive tables with partitioning and bucketing for finer granularity and HiveQL optimization.
- Generated log reports in Splunk based on filters.
- Migrated from JBoss to Jetty.
- Worked on Dockerizing the application.
- Used MongoDB and HBase for multiple use cases.
- Integrated the application with Quartz for scheduling integration jobs (a minimal scheduling sketch follows this list).
- Loaded data from the UNIX file system into HDFS.
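A minimal sketch of a Java Spark Streaming job of the kind described above (the socket source, host/port, and error-filtering logic are illustrative assumptions, not details from the actual project):

import org.apache.spark.SparkConf;
import org.apache.spark.streaming.Durations;
import org.apache.spark.streaming.api.java.JavaDStream;
import org.apache.spark.streaming.api.java.JavaReceiverInputDStream;
import org.apache.spark.streaming.api.java.JavaStreamingContext;

public class StreamingJobSketch {
    public static void main(String[] args) throws InterruptedException {
        SparkConf conf = new SparkConf().setAppName("streaming-sketch");
        // 10-second micro-batches.
        JavaStreamingContext ssc = new JavaStreamingContext(conf, Durations.seconds(10));

        // Placeholder source: read lines from a socket.
        JavaReceiverInputDStream<String> lines = ssc.socketTextStream("localhost", 9999);

        // Keep only error events and print a per-batch count.
        JavaDStream<String> errors = lines.filter(line -> line.contains("ERROR"));
        errors.count().print();

        ssc.start();
        ssc.awaitTermination();
    }
}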
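A minimal JAX-RS resource sketch of the sort mentioned above (the /metrics path, customerId parameter, and stub payload are hypothetical):

import javax.ws.rs.GET;
import javax.ws.rs.Path;
import javax.ws.rs.PathParam;
import javax.ws.rs.Produces;
import javax.ws.rs.core.MediaType;
import javax.ws.rs.core.Response;

@Path("/metrics")
public class ProcessedDataResource {

    // GET /metrics/{customerId} returns processed data as JSON.
    @GET
    @Path("/{customerId}")
    @Produces(MediaType.APPLICATION_JSON)
    public Response getMetrics(@PathParam("customerId") String customerId) {
        // A real implementation would call the processing layer; this is a stub payload.
        String payload = "{\"customerId\":\"" + customerId + "\",\"status\":\"processed\"}";
        return Response.ok(payload).build();
    }
}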
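A minimal Quartz scheduling sketch for an integration job (job/trigger names and the cron expression are illustrative assumptions):

import org.quartz.CronScheduleBuilder;
import org.quartz.Job;
import org.quartz.JobBuilder;
import org.quartz.JobDetail;
import org.quartz.JobExecutionContext;
import org.quartz.Scheduler;
import org.quartz.Trigger;
import org.quartz.TriggerBuilder;
import org.quartz.impl.StdSchedulerFactory;

public class IntegrationJobScheduler {

    // Hypothetical integration job body.
    public static class SyncJob implements Job {
        @Override
        public void execute(JobExecutionContext context) {
            System.out.println("Running scheduled integration sync...");
        }
    }

    public static void main(String[] args) throws Exception {
        Scheduler scheduler = StdSchedulerFactory.getDefaultScheduler();

        JobDetail job = JobBuilder.newJob(SyncJob.class)
                .withIdentity("syncJob", "integration")
                .build();

        // Fire every 15 minutes.
        Trigger trigger = TriggerBuilder.newTrigger()
                .withIdentity("syncTrigger", "integration")
                .withSchedule(CronScheduleBuilder.cronSchedule("0 0/15 * * * ?"))
                .build();

        scheduler.scheduleJob(job, trigger);
        scheduler.start();
    }
}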
Environment: Java 7, JMS, Redis, MySQL, Spring (MVC, Integration, IoC, Aspects, Spring ORM, Spring DAO), XML, XPath, JSON, SOAP, CXF, WinSCP, MongoDB, PuTTY, Ant, web services (JAX-WS, JAX-RS), JAXB, JavaMail (POP3, SMTP, IMAP), Git, WSDL, UNIX (shell scripting, awk, nawk), Elasticsearch, Log4j, LDAP, Cobertura test coverage reports, EMMA code coverage, Apache Knox, WebLogic, Active Directory, Quartz, Eclipse, Jenkins (continuous integration), wiki, Sonar, Scrum methodology, SSL, SSO, cron jobs (Oozie), Hadoop, JPA (Hibernate), Tomcat, Hive, HBase (NoSQL), Sqoop, Oozie, JProfiler, RESTful web services, Java/J2EE design patterns, Infinispan, ELK, JBoss, Jetty, Coherence Cache, Splunk, Ajax.
Confidential, Philadelphia
Java Technical Lead
Responsibilities:
- Involved in complete coding of the project.
- Developed architectural diagrams to illustrate architectural complexities and interactions.
- Worked with business owners, analysts, solution engineers, development teams, and infrastructure services to communicate application and data architectures.
- Architected solutions, prepared design documents, coded, performed code reviews, wrote JUnit test cases, and reviewed test and code coverage reports.
- Wrote jobs using multithreading and optimized data structures.
- Ensured the project successfully implemented the designed solution following architectural tenets.
- Conducted vendor analysis and proofs of concept for new technologies/solutions.
- Played the Scrum Master role for end-to-end product delivery.
- Worked on loading and transforming large sets of structured, semi-structured, and unstructured data into the Hadoop system.
- Good working knowledge of MongoDB and HBase.
- Developed simple and complex MapReduce programs in Java for data analysis (a minimal MapReduce sketch follows this list).
- Implemented partitioning, dynamic partitions, and buckets in Hive.
- Created Hive tables and provided analytical queries for business user analysis.
- Extensive knowledge of Pig scripts using bags and tuples.
- Created Hive tables with partitioning and bucketing for finer granularity and HiveQL optimization (a table-creation sketch also follows this list).
- Identified job dependencies to design Oozie workflows.
- Efficient in building Pig, Hive, and MapReduce scripts.
- Loaded data from the UNIX file system into HDFS.
- Installed and configured Pig and Hive, and wrote Pig and Hive UDFs.
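A minimal Java MapReduce sketch of the simple-analysis kind mentioned above (a token-count job; class names and input/output paths are illustrative):

import java.io.IOException;
import java.util.StringTokenizer;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class EventCount {

    // Mapper: emit (token, 1) for every token in the input line.
    public static class TokenMapper extends Mapper<Object, Text, Text, IntWritable> {
        private static final IntWritable ONE = new IntWritable(1);
        private final Text word = new Text();

        @Override
        protected void map(Object key, Text value, Context context)
                throws IOException, InterruptedException {
            StringTokenizer tokens = new StringTokenizer(value.toString());
            while (tokens.hasMoreTokens()) {
                word.set(tokens.nextToken());
                context.write(word, ONE);
            }
        }
    }

    // Reducer: sum the counts for each token.
    public static class SumReducer extends Reducer<Text, IntWritable, Text, IntWritable> {
        @Override
        protected void reduce(Text key, Iterable<IntWritable> values, Context context)
                throws IOException, InterruptedException {
            int sum = 0;
            for (IntWritable v : values) {
                sum += v.get();
            }
            context.write(key, new IntWritable(sum));
        }
    }

    public static void main(String[] args) throws Exception {
        Job job = Job.getInstance(new Configuration(), "event count");
        job.setJarByClass(EventCount.class);
        job.setMapperClass(TokenMapper.class);
        job.setCombinerClass(SumReducer.class);
        job.setReducerClass(SumReducer.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);
        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}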
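A minimal sketch of creating a partitioned, bucketed Hive table from Java over the HiveServer2 JDBC driver (the connection URL, credentials, table name, columns, and bucket count are assumptions for illustration; the same DDL could equally be run directly in the Hive shell):

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;

public class CreateHiveTableSketch {
    public static void main(String[] args) throws Exception {
        // Requires the hive-jdbc driver on the classpath.
        Class.forName("org.apache.hive.jdbc.HiveDriver");
        // Placeholder HiveServer2 connection details.
        try (Connection conn = DriverManager.getConnection(
                     "jdbc:hive2://localhost:10000/default", "hive", "");
             Statement stmt = conn.createStatement()) {

            // Partition by event date (partition pruning) and bucket by user id (sampling/joins).
            stmt.execute(
                "CREATE TABLE IF NOT EXISTS events ("
              + " user_id STRING,"
              + " action STRING,"
              + " amount DOUBLE)"
              + " PARTITIONED BY (event_date STRING)"
              + " CLUSTERED BY (user_id) INTO 16 BUCKETS"
              + " STORED AS ORC");
        }
    }
}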
Environment: Java 6, Struts, JMS, Hibernate, Spring (MVC, Integration, IoC, Aspects, Spring ORM, Spring DAO), XML, XPath, Ajax, JSON, WinSCP, Node.js, AngularJS, MongoDB, PuTTY, Ant, web services (JAX-WS, JAX-RS), JAXB, JavaMail (POP3, SMTP, IMAP), SVN, UNIX (shell scripting, awk, nawk), Log4j, EhCache, Cobertura test coverage reports, jQuery, EMMA code coverage, ADF Faces, PrimeFaces, PrimeFaces Mobile, WebLogic, Apache, Quartz, Eclipse, CruiseControl (continuous integration), wiki, Sonar, Oracle RAC cluster, Scrum methodology, SOLR, SSL, SSO (SiteMinder), Selenium, cron jobs (Oozie), Talend (ETL), Pig scripts, Hadoop, ESB (Mule), Python scripting, Java, MapReduce, Pig, Hive, HBase (NoSQL), Sqoop, Oozie, SOAP, RESTful web services, JSF, Jenkins, Java/J2EE design patterns, Git, Cassandra, Tomcat, Coherence Cache, Splunk, MySQL.
Confidential
Java Technical Lead
Responsibilities:
- Developed architectural diagrams to illustrate architectural complexities and interactions.
- Worked with business owners, analysts, solution engineers, development teams, and infrastructure services to communicate application and data architectures.
- Involved in complete coding of the project.
- Prepared use case documents, low-level design, package diagrams, class diagrams, and sequence diagrams.
- Architected solutions, prepared design documents, coded, performed code reviews, wrote JUnit test cases, and reviewed test and code coverage reports.
- Ensured the project successfully implemented the designed solution following architectural tenets.
- Conducted vendor analysis and proofs of concept for new technologies/solutions.
- Wrote and modified build shell scripts and automated release processes.
- Played the Scrum Master role for product delivery.
Environment: Java, Struts, JMS, Hibernate, Spring, JSF, XML, WinSCP, PuTTY, Ant, web services (JAX-WS, JAX-RS), SVN, Perforce, UNIX (shell scripting, awk, nawk), Hudson, JBoss, StAX, JAAS (JSSE, JCE), jQuery, RESTful web services, SOAP, REST, Google Collections API, Apache Shiro, RichFaces, TIAA-CREF, IBM WebSphere tools suite.
Confidential
Java Technical Lead
Responsibilities:
- Responsible for liaising with users to gather requirements and issues.
- Prepared design documents and JUnit test cases, and performed coding.
- Involved in complete coding of the project.
- Responsible for assigning modules to team members and conducting reviews.
- Ensured the project successfully implemented the designed solution following architectural tenets.
Environment: Java, EJB 2.1, JMS (TIBCO EMS, IBM MQ), RMI, XML, WinSCP, PuTTY, CVS, Perforce, UNIX (shell scripting, awk, nawk), FIX 4.0, Sonar, JNI. GE Asset Management, Hyderabad, India.
Confidential
Java Technical Lead
Responsibilities:
- Specified functionality, performed requirements analysis, and produced detailed technical designs.
- Assisted in architecture & framework creation.
- Prepared Use Case documents for the modules.
- Created the low level design for the modules, which included Package Diagrams, Class Diagrams and Sequence Diagrams.
- Implemented Java/J2EE design patterns such as Business Delegate, Front Controller, MVC, Session Facade, Value Object, DAO, Service Locator, Singleton, Prototype, Adapter, Factory Method, Observer, Flyweight, Builder, and Abstract Factory (a minimal Singleton sketch follows this list).
- Involved in complete coding of the project.
- Used Asynchronous JavaScript and XML (AJAX) for a faster, more interactive front end.
- Wrote and modified build shell scripts for database updates, release process automation, and production support.
- Reviewed and assisted junior developers with design and development.
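A minimal sketch of one of the patterns listed above, a Singleton using the initialization-on-demand holder idiom (the class name is hypothetical):

public class ConnectionManager {

    private ConnectionManager() {
        // Private constructor prevents outside instantiation.
    }

    private static class Holder {
        // Class loading guarantees lazy, thread-safe, one-time creation.
        private static final ConnectionManager INSTANCE = new ConnectionManager();
    }

    public static ConnectionManager getInstance() {
        return Holder.INSTANCE;
    }
}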
Environment: Java, JavaScript, JSP, Ajax, XML, XSLT, Oracle, JUnit, JBoss, Hibernate, Spring, EJB, JMS, StarUML, JIRA, Flex, WinSCP, PuTTY, VSS 8.0, multithreading, Hammurapi, PL/SQL, Struts, UNIX (shell scripting).