Big Data/Hadoop Developer Resume
Plano, TX
SUMMARY
- 8+ years of professional IT experience, including 3 years in the Hadoop/Big Data ecosystem and Tableau-related technologies.
- Designed and customized the Attivio product to store data for further analysis using machine learning algorithms
- Basic knowledge of shell scripting
- Architected and designed product services using REST web services and Spring Dynamic Modules with OSGi
- Extensive knowledge of creating project plans, reports and matrices using MSP
- Involved in project design and implementation using object-oriented principles, with good implementation knowledge of Java/J2EE design patterns
- Extensively worked on designing applications using OSGi, Spring Dynamic Modules and J2EE technologies: Servlets, Swing, Applets, JSP 1.x, JDBC, JNDI, EJB, XML 1.0 and Struts
- Designed application workflows using jBPM
- Designed and developed a Cash Office solution using Swing and the Spring Framework
- Customized and developed the Oracle 360Commerce retail application (POS, Back Office and Central Office) using J2EE design patterns
- Database design using Erwin, and programming skills including PL/SQL, JDBC and SQL with DB2, Oracle and SQL Server
- Worked on multiple stages of the Software Development Life Cycle, including development, component integration, performance testing, deployment, and support and maintenance.
- Knowledge of UNIX and shell scripting.
- Established AWS technical credibility with customers and external parties
- Quick to adapt to new software applications and products; a self-starter with excellent communication skills and a good understanding of business workflow.
- Experience in importing and exporting data using Sqoop from HDFS to Relational Database Systems and vice-versa.
- Extensive experience working with Oracle, DB2, SQL Server and MySQL databases.
- Experience in design, development and analysis of SQL Server 2005/2008 and Microsoft Business Intelligence (MSBI) tools - SSIS, SSRS and SSAS
TECHNICAL SKILLS
Hadoop/Big Data: HDFS, MapReduce, HBase, Pig, Hive, Sqoop, Flume, Cassandra, Impala, Oozie, ZooKeeper, MapR, Amazon Web Services, EMR, MRUnit, Spark, Storm, Greenplum, Datameer, R, Ignite.
Java & J2EE Technologies: Core Java, Servlets, JSP, JDBC, JNDI, Java Beans
IDEs: Eclipse, NetBeans
Frameworks: MVC, Struts, Hibernate, Spring
Programming languages: C, C++, Java, Python, Ant scripts, Linux shell scripts, R, Perl
Databases: Oracle 11g/10g/9i, MySQL, DB2, MS SQL Server, MongoDB, CouchDB, Graph DB
Web Servers: WebLogic, WebSphere, Apache Tomcat
Web Technologies: HTML, XML, JavaScript, AJAX, SOAP, WSDL
Network Protocols: TCP/IP, UDP, HTTP, DNS, DHCP
ETL Tools: Informatica, IBM Infosphere, Qlikview and Cognos
PROFESSIONAL EXPERIENCE
Confidential, Plano, TX
BigData/Hadoop Developer
Responsibilities:
- Created Hive queries that helped market analysts spot emerging trends by comparing fresh data with EDW reference tables and historical metrics.
- Enabled speedy reviews and first-mover advantages by using Oozie to automate data loading into the Hadoop Distributed File System and Pig to pre-process the data.
- Provided design recommendations and thought leadership to sponsors/stakeholders that improved review processes and resolved technical problems.
- Managed and reviewed Hadoop log files.
- Tested raw data and executed performance scripts.
- Shared responsibility for administration of Hadoop, Hive and Pig.
- Responsible for developing MapReduce programs using text analytics and pattern-matching algorithms; a representative mapper is sketched after this list.
- Involved in importing data from various client ticketing servers such as Remedy, Altiris, Cherwell and OTRS into HDFS
- Assisted the development team in installing single-node Hadoop 2.2.4 on local machines
- Coded REST web services and clients to fetch tickets from the client ticketing servers
- Facilitated sprint planning, retrospective and closure meetings for each sprint and helped capture metrics such as team status
- Participated in architectural and design decisions with respective teams
- Developed in-memory data grid solution across conventional and cloud environments using Oracle Coherence.
- Worked with customers to develop and support solutions that use our in-memory data grid product.
- Used Pig as an ETL tool to do transformations, event joins, filters and some pre-aggregations before storing the data onto HDFS
- Optimized MapReduce code and Pig scripts; performed user-interface analysis, performance tuning and analysis.
- Performed analysis with the data visualization tool Tableau.
- Wrote Pig scripts for data processing.
- Used Hive to analyze the partitioned and bucketed data and compute various metrics for reporting on the dashboard; the Hive access sketch after this list shows the pattern.
- Loaded the aggregated data onto DB2 for reporting on the dashboard.
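As a rough illustration of the pattern-matching MapReduce work above, the mapper below scans raw ticket text with a regular expression and emits per-ticket counts; the ticket-ID format and class names are hypothetical, not taken from the actual project.
```java
// Sketch of a pattern-matching mapper using the org.apache.hadoop.mapreduce API.
// The ticket-ID regex and field layout are illustrative assumptions.
import java.io.IOException;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;

public class TicketPatternMapper
        extends Mapper<LongWritable, Text, Text, IntWritable> {

    private static final Pattern TICKET_ID =
            Pattern.compile("(INC|REQ)\\d{6,}");      // hypothetical ticket-number format
    private static final IntWritable ONE = new IntWritable(1);
    private final Text matchedId = new Text();

    @Override
    protected void map(LongWritable key, Text value, Context context)
            throws IOException, InterruptedException {
        // Emit (ticket-id, 1) for every match in the line; a summing reducer
        // then produces per-ticket counts for downstream analysis.
        Matcher m = TICKET_ID.matcher(value.toString());
        while (m.find()) {
            matchedId.set(m.group());
            context.write(matchedId, ONE);
        }
    }
}
```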
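The Hive reporting work above ran metric queries over partitioned, bucketed tables; the JDBC sketch below shows that access pattern with placeholder host, table and column names, not the actual EDW schema.
```java
// Sketch of querying a partitioned/bucketed Hive table over JDBC.
// Host, schema, table and column names are placeholders.
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class HiveTrendReport {
    public static void main(String[] args) throws Exception {
        Class.forName("org.apache.hive.jdbc.HiveDriver");
        Connection conn = DriverManager.getConnection(
                "jdbc:hive2://hive-server:10000/analytics", "hive", "");
        Statement stmt = conn.createStatement();

        // daily_sales is assumed partitioned by load_date and bucketed by product_id;
        // the query compares the latest partition against EDW reference baselines.
        ResultSet rs = stmt.executeQuery(
            "SELECT d.product_id, SUM(d.amount) AS daily_total, r.baseline "
          + "FROM daily_sales d JOIN edw_reference r ON d.product_id = r.product_id "
          + "WHERE d.load_date = '2015-06-01' "
          + "GROUP BY d.product_id, r.baseline");
        while (rs.next()) {
            System.out.printf("%s\t%.2f (baseline %.2f)%n",
                    rs.getString(1), rs.getDouble(2), rs.getDouble(3));
        }
        rs.close();
        stmt.close();
        conn.close();
    }
}
```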
Environment: BigData/Hadoop, JDK1.6, Linux, Python, Java, Agile, RESTful Web Services, HDFS, Map-Reduce, Hive, Pig, Sqoop, Flume, Zookeeper, Oozie, DB2, NoSQL, HBase and Tableau.
Confidential, Richmond, VA
Big Data/Hadoop Developer
Responsibilities:
- Worked on evaluation and analysis of the Hadoop cluster and different big data analytic tools including Pig, the HBase database and Sqoop.
- Responsible for building scalable distributed data solutions using Hadoop.
- Involved in loading data from LINUX file system to Hadoop Distributed File System.
- Created HBase tables to store various data formats of PII data coming from different portfolios; a table-creation sketch follows this list.
- Experience in managing and reviewing Hadoop log files.
- Exported the analyzed and processed data to relational databases using Sqoop, for visualization and for generating reports for the BI team.
- Installed the Oozie workflow engine to run multiple Hive and Pig jobs.
- Developed, captured and documented architectural best practices for building systems on AWS
- Analyzed large data sets to determine the optimal way to aggregate and report on them
- Worked with the Data Science team to gather requirements for various data mining projects.
- Developed Pig and Hive queries as well as UDFs to pre-process the data for analysis; a Java Hive UDF along these lines is sketched after this list.
- Ingested data into HDFS and Hive using Flume.
- Developed data-quality monitoring and systems software in Python with Flask, working on news content systems and infrastructure.
- Analyzed large data sets by running Hive queries and Pig scripts.
- Created dashboards using Tableau to analyze data for reporting.
- Supported setting up the QA environment and updating configurations for implementing scripts with Pig and Sqoop.
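A rough sketch of the HBase table setup mentioned above, using the classic HBaseAdmin/HTableDescriptor client API; the table and column-family names are illustrative, not the actual portfolio layout.
```java
// Sketch: create an HBase table for PII records, one column family per portfolio.
// Table and family names are placeholders.
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.HColumnDescriptor;
import org.apache.hadoop.hbase.HTableDescriptor;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.HBaseAdmin;

public class CreatePiiTable {
    public static void main(String[] args) throws Exception {
        Configuration conf = HBaseConfiguration.create();   // reads hbase-site.xml from the classpath
        HBaseAdmin admin = new HBaseAdmin(conf);
        try {
            HTableDescriptor table =
                    new HTableDescriptor(TableName.valueOf("customer_pii"));
            table.addFamily(new HColumnDescriptor("retail"));    // one family per source portfolio
            table.addFamily(new HColumnDescriptor("mortgage"));
            if (!admin.tableExists(table.getTableName())) {
                admin.createTable(table);
            }
        } finally {
            admin.close();
        }
    }
}
```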
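Hive UDFs for pre-processing are written in Java; the sketch below follows the old-style UDF contract with an illustrative masking rule, not the project's actual transformation.
```java
// Sketch of a Hive UDF using the classic org.apache.hadoop.hive.ql.exec.UDF contract.
// The masking rule is an illustrative assumption.
import org.apache.hadoop.hive.ql.exec.UDF;
import org.apache.hadoop.io.Text;

public final class MaskAccountUdf extends UDF {

    // Keeps only the last four characters, e.g. "1234567890" -> "******7890".
    public Text evaluate(Text input) {
        if (input == null) {
            return null;
        }
        String value = input.toString();
        int keep = Math.min(4, value.length());
        StringBuilder masked = new StringBuilder();
        for (int i = 0; i < value.length() - keep; i++) {
            masked.append('*');
        }
        masked.append(value.substring(value.length() - keep));
        return new Text(masked.toString());
    }
}
```
Once packaged into a jar, a UDF like this is registered in Hive with ADD JAR and CREATE TEMPORARY FUNCTION before being used in the pre-processing queries.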
Environment: Hadoop, HDFS, Pig, Sqoop, HBase, Shell Scripting, Linux, JSON, AJAX, Informatica and RDBMS
Confidential
Java Developer
Responsibilities:
- Collected, understood, and transmitted the business requirements for the project
- Used agile development methodology.
- Involved in analysis, design and development of the product and developed specifications that include Use Cases, Class Diagrams, and Sequence Diagrams.
- Involved in build, staging, Testing and deployment of J2EE applications.
- Project Planning, monitoring and control for small and large projects
- Involved in requirement analysis, design and implementation activities.
- Worked in UI team to develop new customer facing portal for Long Term Care Partners.
- Deployment and Post deployment support.
- Created test cases and technical documents.
- Developed the front-end GUI using JavaServer Faces.
- Implemented Java APIs using core Java
- Integrated the front end with the API.
- Developed the user interfaces using Struts, JSP, JSTL, HTML, AJAX and JavaScript; a Struts action sketch follows this list.
- Involved in each phase of SDLC to ensure smooth and proper functioning of the project.
- Retrieved source data using SQL for data analysis.
- Performed User Acceptance Testing.
- Developed business flow diagrams, data flow diagrams, activity diagrams and use case diagrams using MS Visio.
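The Struts-based UI work above revolves around action classes; the sketch below is a minimal Struts 1 action with placeholder parameter, attribute and forward names.
```java
// Sketch of a Struts 1 action behind one of the portal screens.
// Parameter, attribute and forward names are placeholders.
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

import org.apache.struts.action.Action;
import org.apache.struts.action.ActionForm;
import org.apache.struts.action.ActionForward;
import org.apache.struts.action.ActionMapping;

public class EnrollmentAction extends Action {

    @Override
    public ActionForward execute(ActionMapping mapping, ActionForm form,
                                 HttpServletRequest request,
                                 HttpServletResponse response) throws Exception {
        // Read the request, perform the (placeholder) business step,
        // and expose the result to the JSP via a request attribute.
        String memberId = request.getParameter("memberId");
        request.setAttribute("welcomeMessage", "Enrollment received for " + memberId);
        return mapping.findForward("success");   // forward defined in struts-config.xml
    }
}
```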
Environment: MS Office 2007 (Word, PowerPoint, Excel), MS Visio, Agile, Core Java (1.4), Oracle, Apache Tomcat, JSP, JSTL and Linux
Confidential, NY
Java Developer
Responsibilities:
- Developed the system by following the agile methodology.
- Involved in the implementation of design using vital phases of the Software development life cycle (SDLC) that includes Development, Testing, Implementation and Maintenance Support.
- Experienced in Agile programming and in accomplishing the assigned tasks.
- Used Ajax and JavaScript to handle asynchronous requests, and CSS to handle the look and feel of the application.
- Involved in the design of class diagrams, sequence diagrams and event diagrams as part of the documentation.
- Developed the presentation layer using CSS and HTML taken from Bootstrap, targeting multiple browsers including mobiles and tablets.
- Extended standard action classes provided by the Struts framework for appropriately handling client requests.
- Configured Struts Tiles for reusing view components as an application of the J2EE Composite pattern.
- Involved in the integration of Struts and Spring 2.0 for implementing Dependency Injection (DI/IoC); developed code for obtaining bean references from the Spring IoC framework, along the lines of the sketch after this list.
- Developed the application on Eclipse.
- Involved in the implementation of beans in the application.
- Mapped the representation from the MVC model to an Oracle relational data model with a SQL-based schema.
- Developed SQL queries and stored procedures using PL/SQL to retrieve from and insert into multiple database schemas.
- Performed unit testing using JUnit and load testing using LoadRunner.
- Implemented Log4J to trace logs and to track information.
- Applied OOAD principles for the analysis and design of the system.
- Used WebSphere Application Server to deploy the build.
- Developed front-end screens using JSP, HTML, jQuery, JavaScript and CSS.
- Used Spring Framework for developing business objects.
- Performed data validation in Struts Form beans and Action Classes.
- Used Eclipse for the Development, Testing and Debugging of the application.
- SQL Developer was used as a database client.
- Used WinSCP to transfer files from the local system to other systems.
- Used Rational ClearQuest for defect logging and issue tracking.
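A minimal sketch of obtaining bean references from the Spring IoC container, as was typical alongside Struts in the Spring 2.x era; the context file and bean name are placeholders for the project's actual configuration.
```java
// Sketch: load the Spring application context and look up a wired bean by name.
// "applicationContext.xml" and "accountService" are assumed names.
import org.springframework.context.ApplicationContext;
import org.springframework.context.support.ClassPathXmlApplicationContext;

public class SpringBeanLocator {

    private static final ApplicationContext CONTEXT =
            new ClassPathXmlApplicationContext("applicationContext.xml");

    // Callers (e.g. Struts actions) obtain fully wired services from the
    // container instead of constructing them with "new".
    public static Object getBean(String beanName) {
        return CONTEXT.getBean(beanName);
    }

    public static void main(String[] args) {
        Object accountService = getBean("accountService");
        System.out.println("Resolved bean: " + accountService);
    }
}
```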
Environment: jQuery, JSP, Servlets, JSF, JDBC, HTML, JUnit, JavaScript, XML, SQL, Maven, RESTful Web Services, UML.
Confidential, Ohio
Java Developer
Responsibilities:
- Gathered and analyzed user/business requirements and developed System test plans.
- Managed the project using Test Director, added test categories and test details.
- Involved in using various PeopleSoft Modules.
- Performed execution of test cases manually to verify the expected results.
- Created Recovery Scenarios for the application exception handling using recovery manager.
- Implemented cross-cutting concerns as aspects at the service layer using Spring AOP.
- Involved in the implementation of DAO objects using Spring ORM.
- Involved in the JMS connection pool and the implementation of publish/subscribe using Spring JMS; used JmsTemplate to publish and a Message-Driven POJO (MDP) to subscribe from the JMS provider, as in the first sketch after this list.
- Involved in creating the Hibernate POJOs and developed Hibernate mapping files; the second sketch after this list shows a representative POJO.
- Used Hibernate, an object/relational mapping (ORM) solution, to map the data representation from the object model to a relational data model.
- Involved in doing the gap analysis of the use cases and requirements.
- Developed test scenarios for test automation.
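The publish/subscribe work above pairs a JmsTemplate publisher with a Message-Driven POJO subscriber; the sketch below assumes the ConnectionFactory, listener container and topic are configured in the Spring context, and the destination name is a placeholder.
```java
// Sketch of the Spring JMS pieces: an MDP subscriber plus a JmsTemplate publish call.
// Destination name and payload format are illustrative.
import javax.jms.JMSException;
import javax.jms.Message;
import javax.jms.MessageListener;
import javax.jms.TextMessage;

import org.springframework.jms.core.JmsTemplate;

public class OrderEventMdp implements MessageListener {

    // Subscriber side: a DefaultMessageListenerContainer in the Spring context
    // delivers messages from the topic to this POJO.
    public void onMessage(Message message) {
        try {
            if (message instanceof TextMessage) {
                System.out.println("Received: " + ((TextMessage) message).getText());
            }
        } catch (JMSException e) {
            throw new RuntimeException("Failed to read JMS message", e);
        }
    }

    // Publisher side: the JmsTemplate (connection factory set in the context)
    // sends the event to the named topic.
    public static void publish(JmsTemplate jmsTemplate, String payload) {
        jmsTemplate.convertAndSend("order.events", payload);
    }
}
```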
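For the Hibernate mapping files mentioned above, the mapped classes are plain POJOs; the sketch below shows a representative entity, with the corresponding hbm.xml column names assumed for illustration only.
```java
// Sketch of a Hibernate POJO; a matching Customer.hbm.xml is assumed, mapping
// the class to a CUSTOMER table (id -> CUSTOMER_ID, name -> FULL_NAME).
import java.io.Serializable;

public class Customer implements Serializable {

    private static final long serialVersionUID = 1L;

    private Long id;      // <id name="id" column="CUSTOMER_ID"> in the mapping file
    private String name;  // <property name="name" column="FULL_NAME"/>

    // Hibernate instantiates entities through a no-argument constructor.
    public Customer() {
    }

    public Long getId() {
        return id;
    }

    public void setId(Long id) {
        this.id = id;
    }

    public String getName() {
        return name;
    }

    public void setName(String name) {
        this.name = name;
    }
}
```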
Environment: Windows 98, Java 1.4, C, C++, JSP, Servlets, J2EE, PHP, multithreading, OO design, JDBC, HTML, RAD, WSAD.