Specialist
SUMMARY
- 8+ years’ experience in the design and development of web applications on the Java/J2EE platform.
- 3+ years’ experience in Hadoop, Big Data, and NoSQL.
- Extensive expertise in designing, coding, and testing web applications using Core Java and J2EE technologies (Servlets, JSP, JDBC, JMS, Web Services), Struts, Spring, Hibernate, Linux, SNMP, and Apache Hadoop.
TECHNICAL SKILLS
- Key Languages: Java/J2EE
- Programming Language: Core Java (1.5 and 1.6)
- Web Technologies: Servlets, JSP, JDBC, JMS, HTML, XML, Web Services, Spring Framework, SNMP, SMSC, SMPP, Struts 2.0, Hibernate, SS7, SIP Protocol (IMS)
- Web Server: Apache Tomcat 5.x
- Application Servers: BEA WebLogic 9.2/10.0, JBoss 5.0
- Tools: Apache Ant, Maven, CVS, ClearCase, SVN
- IDEs: IntelliJ IDEA 6.0, Eclipse
- Databases: Oracle 10g, PL/SQL, NoSQL
- Operating Systems: Windows XP/NT/2000, Linux (RHEL 3.1/5.1), Solaris
- Testing Tools: SoapUI, JUnit
- Hadoop / Big Data: Deliver competitive intelligence and timely, holistic analyses of Big Data, enabled by successful installation, configuration, and administration of Hadoop ecosystem components and architecture.
- Two years’ experience installing, configuring, and testing Hadoop ecosystem components.
- Capable of processing large sets of structured, semi-structured and unstructured data and supporting systems application architecture.
- Able to assess business rules, collaborate with stakeholders and perform source-to-target data mapping, design and review.
- Familiar with data architecture, including data ingestion pipeline design, Hadoop information architecture, data modeling, data mining, machine learning, and advanced data processing; experienced in optimizing ETL workflows.
PROFESSIONAL EXPERIENCE
Specialist, Confidential
Responsibilities:
- Recommended reclamation of unused licenses, if any.
- Highlighted all licenses approaching their utilization thresholds.
- Reported licenses procured but not yet activated or installed.
- Suggested corrective actions for any deviations observed.
- Developed jobs in Pentaho to parse the raw data, populate staging tables, and store the refined data in a MySQL database (a minimal load sketch follows this list).
- Created a front-end page to download reports in Excel and PDF formats.
- Provided design recommendations and thought leadership to sponsors/stakeholders that improved review processes and resolved technical problems.
- Managed and reviewed log files.
- Tested raw data and executed performance scripts.
- Shared responsibility for administration of License Audit Tool.
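A minimal sketch of the parse-and-stage step described above, written in plain JDBC rather than Pentaho; the license_raw.csv file and the license_staging table and its columns are illustrative assumptions, not the project’s actual schema:

```java
import java.io.BufferedReader;
import java.io.FileReader;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;

// Parses raw license records and bulk-loads them into a MySQL staging table.
// File, table, and column names are illustrative placeholders.
public class LicenseStagingLoader {
    public static void main(String[] args) throws Exception {
        Connection conn = DriverManager.getConnection(
                "jdbc:mysql://localhost:3306/licensedb", "user", "password");
        PreparedStatement stmt = conn.prepareStatement(
                "INSERT INTO license_staging (product, seats, activated) VALUES (?, ?, ?)");
        BufferedReader in = new BufferedReader(new FileReader("license_raw.csv"));
        String line;
        while ((line = in.readLine()) != null) {
            String[] f = line.split(",");   // raw record: product,seats,activated
            stmt.setString(1, f[0].trim());
            stmt.setInt(2, Integer.parseInt(f[1].trim()));
            stmt.setBoolean(3, Boolean.parseBoolean(f[2].trim()));
            stmt.addBatch();
        }
        stmt.executeBatch();                // load the staging table in one batch
        in.close();
        conn.close();
    }
}
```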
Environment:
- Big Data Analytics/ETL Tool: Pentaho
- Database and Big Data ETL: Pig, HBase, Sqoop, Apache Lucene
- Programming Language: Java 1.6, Struts 2.0, Hibernate
- Scripting: JSP, JavaScript, CSS, Ajax
- Database: MySQL
- Operating Systems: Linux
Hadoop/Big Data Development, Confidential
Responsibilities:
- Developed MapReduce programs to parse the raw data, populate staging tables, and store the refined data in partitioned tables in the EDW (see the mapper sketch after this list).
- Created Hive queries that helped market analysts spot emerging trends by comparing fresh data with EDW tables and historical metrics.
- Enabled speedy reviews and first-mover advantages by using Oozie to automate data loading into the Hadoop Distributed File System and Pig to pre-process the data.
- Provided design recommendations and thought leadership to sponsors/stakeholders that improved review processes and resolved technical problems.
- Managed and reviewed Hadoop log files.
- Tested raw data and executed performance scripts.
- Shared responsibility for administration of Hadoop, Hive and Pig.
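A minimal sketch of the mapper side of such a parse-and-stage job, assuming pipe-delimited raw records keyed by event date; the field positions are illustrative assumptions, not the original layout:

```java
import java.io.IOException;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;

// Parses raw delimited records and keys them by date so the reducer output
// can be loaded into date-partitioned staging tables in the EDW.
public class RawRecordMapper extends Mapper<LongWritable, Text, Text, Text> {
    @Override
    protected void map(LongWritable offset, Text line, Context context)
            throws IOException, InterruptedException {
        String[] fields = line.toString().split("\\|");
        if (fields.length < 3) {
            return;                       // skip malformed records
        }
        String eventDate = fields[0];     // partition key for the staging table
        String payload = fields[1] + "\t" + fields[2];
        context.write(new Text(eventDate), new Text(payload));
    }
}
```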
Environment:
- File System: HDFS
- Big Data Technology: Apache Hadoop 1.1
- Data Processing Language: MapReduce
- Database and Big Data ETL: Pig, HBase, Sqoop, Apache Lucene
- Programming Language: Java 1.6
- Database: Oracle 10g, NoSQL
- Operating Systems: VMware Workstation, Ubuntu image
Hadoop Developer/Administrator, Confidential
Responsibilities:
- Installed and configured MapReduce, Hive, and HDFS; assisted with performance tuning and monitoring.
- Created HBase tables to load large sets of structured, semi-structured, and unstructured data coming from UNIX, NoSQL, and a variety of portfolios (see the sketch after this list).
- Supported code/design analysis, strategy development and project planning.
- Created reports for the BI team using Sqoop to export data into HDFS and Hive.
- Developed multiple MapReduce jobs in Java for data cleaning and preprocessing.
- Assisted with data capacity planning and node forecasting.
- Collaborated with the infrastructure, network, database, application and BI teams to ensure data quality and availability.
- Served as administrator for Pig, Hive, and HBase, installing updates, patches, and upgrades.
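A minimal sketch of loading a record into one of these HBase tables via the classic Java client API; the table name, column family, and values are illustrative assumptions, not the original schema:

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.client.HTable;
import org.apache.hadoop.hbase.client.Put;
import org.apache.hadoop.hbase.util.Bytes;

// Writes a single row into an HBase table using the 0.9x-era client API
// that matches the Hadoop 1.1 stack listed in the environment below.
public class HBaseLoader {
    public static void main(String[] args) throws Exception {
        Configuration conf = HBaseConfiguration.create();
        HTable table = new HTable(conf, "portfolio");    // illustrative table name
        Put put = new Put(Bytes.toBytes("row-0001"));    // row key
        put.add(Bytes.toBytes("d"), Bytes.toBytes("source"), Bytes.toBytes("unix-feed"));
        put.add(Bytes.toBytes("d"), Bytes.toBytes("value"), Bytes.toBytes("42"));
        table.put(put);
        table.close();
    }
}
```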
Environment:
- File System: HDFS
- Big Data Technology: Apache Hadoop 1.1
- Data Processing Language: MapReduce
- Database and Big Data ETL: Pig, HBase, Sqoop, Apache Lucene
- Programming Language: Java 1.6
- Database: Oracle 10g, NoSQL
- Operating Systems: VMware Workstation, Ubuntu image
Hadoop/Big Data Development, Confidential
Responsibilities:
- Explored the Hadoop framework and its related software and assessed its applicability to banking.
- Created MapReduce programs and Pig data-analysis scripts.
- Created and maintained the Hadoop cluster.
- Wrote MapReduce programs in Java to extract information from huge volumes of files and load it into HBase (see the sketch after this list).
- Performed code reviews and supervised junior developers.
- Wrote ETL scripts using Sqoop to transfer required data from Hadoop to the relational database (Oracle).
- Performed unit testing and wrote analytic queries using Pig scripting.
- Responsible for integration and testing of systems.
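A minimal sketch of the extract-and-load pattern from the MapReduce-to-HBase bullet above, written as a map-only job body; the driver wiring via TableOutputFormat is omitted, and the column family and field layout are illustrative assumptions:

```java
import java.io.IOException;
import org.apache.hadoop.hbase.client.Put;
import org.apache.hadoop.hbase.io.ImmutableBytesWritable;
import org.apache.hadoop.hbase.util.Bytes;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;

// Extracts fields from each input line and emits an HBase Put; the driver
// (not shown) binds the job to a target table through TableOutputFormat.
public class FileToHBaseMapper
        extends Mapper<LongWritable, Text, ImmutableBytesWritable, Put> {
    @Override
    protected void map(LongWritable offset, Text line, Context context)
            throws IOException, InterruptedException {
        String[] fields = line.toString().split("\t");
        if (fields.length < 2) {
            return;                                // skip malformed lines
        }
        byte[] rowKey = Bytes.toBytes(fields[0]);  // first field as row key
        Put put = new Put(rowKey);
        put.add(Bytes.toBytes("events"), Bytes.toBytes("detail"),
                Bytes.toBytes(fields[1]));
        context.write(new ImmutableBytesWritable(rowKey), put);
    }
}
```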
Environment:
- File System: HDFS
- Big Data Technology: Apache Hadoop 1.1
- Data Processing Language: MapReduce
- Database and Big Data ETL: Pig, HBase, Sqoop
- Programming Language: Java 1.6
- Relational Database: Oracle 10g
- Operating Systems: VMware Workstation, Ubuntu image
Global Incident Management, Confidential, USA
Responsibilities:
- Provided technical direction for developing, delivering, and maintaining technology-based business solutions.
- Created the architecture document, ensuring that the description of the architecture was accurate and complete and that all deviations were clearly documented.
- Responsible for updating the delivery manager on the status of development efforts.
- Responsible for creating and executing development plans using FP (function point) counts for project estimation.
- Performed code reviews and supervised junior developers.
- Used Tomcat as the web server for deployment.
- Performed unit testing using JUnit 1.4 (a test sketch follows this list).
- Responsible for integration and testing of systems.
- Developed activity diagrams in requirements modeling, and class and sequence diagrams in analysis and design models.
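A minimal sketch of the unit-testing style noted above, using JUnit 4-style annotations; the incident classifier below is an illustrative stand-in, not code from the project:

```java
import static org.junit.Assert.assertEquals;
import org.junit.Test;

// Unit tests for a hypothetical incident-priority rule.
public class IncidentClassifierTest {

    // Illustrative stand-in for a class under test.
    static class IncidentClassifier {
        int priorityFor(String category) {
            return "OUTAGE".equals(category) ? 1 : 5;   // outages are P1, rest default
        }
    }

    @Test
    public void outageIncidentsAreHighestPriority() {
        assertEquals(1, new IncidentClassifier().priorityFor("OUTAGE"));
    }

    @Test
    public void otherCategoriesGetDefaultPriority() {
        assertEquals(5, new IncidentClassifier().priorityFor("CHANGE"));
    }
}
```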
Environment:
- Java 1.5, JSP, Struts 2.0, Hibernate, JUnit 1.4, XML, Ant, SVN, Tomcat as the web server, Linux/Windows, Oracle 10g, Eclipse Galileo as the IDE.
Telecom, Confidential
Responsibilities:
- Provided technical direction for developing, delivering, and maintaining technology-based business solutions.
- Responsible for updating the project manager on the status of development efforts.
- Responsible for creating and executing development plans.
- Performed code reviews and supervised junior developers.
- Used JBoss 6.1 as the application server for deployment.
- Performed unit testing using JUnit 1.4.
- Responsible for integration and testing of systems.
- Developed activity diagrams in requirements modeling, and class and sequence diagrams in analysis and design models.
Environment:
- Java 1.6, JSP/JSTL, Spring, JUnit 1.4, JDBC, XML, Ant, ClearCase, JBoss 6.1 and WebLogic 9.2 as application servers, UNIX, Linux (RHEL 5.1/3.1), SNMP (TRAP; a trap-sending sketch follows), Oracle 10g, PL/SQL, Eclipse Galileo and IntelliJ IDEA 6.0 as IDEs.
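The environment above lists SNMP (TRAP); below is a minimal trap-sending sketch. SNMP4J is an assumed library choice (the resume names only the protocol), and the manager address and enterprise OID are placeholders:

```java
import org.snmp4j.CommunityTarget;
import org.snmp4j.PDU;
import org.snmp4j.Snmp;
import org.snmp4j.mp.SnmpConstants;
import org.snmp4j.smi.OID;
import org.snmp4j.smi.OctetString;
import org.snmp4j.smi.UdpAddress;
import org.snmp4j.smi.VariableBinding;
import org.snmp4j.transport.DefaultUdpTransportMapping;

// Sends an SNMPv2c TRAP to a manager; SNMP4J is an assumed library choice,
// and the address and trap OID below are illustrative placeholders.
public class TrapSender {
    public static void main(String[] args) throws Exception {
        CommunityTarget target = new CommunityTarget();
        target.setCommunity(new OctetString("public"));
        target.setAddress(new UdpAddress("127.0.0.1/162"));  // SNMP manager endpoint
        target.setVersion(SnmpConstants.version2c);

        PDU pdu = new PDU();
        pdu.setType(PDU.TRAP);
        pdu.add(new VariableBinding(SnmpConstants.snmpTrapOID,
                new OID("1.3.6.1.4.1.99999.1")));            // placeholder trap OID
        new Snmp(new DefaultUdpTransportMapping()).send(pdu, target);
    }
}
```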