Big Data / Java Lead Developer Resume
Birmingham
SUMMARY
- 12 yrs. of IT experience in Analysis, Design, Development, Integration, and Maintenance of Java/J2EE based applications, including 4 yrs. of experience in Big Data Analytics as a Hadoop and Spark Developer.
- Experience with Spark Core, Spark Streaming, Spark SQL, and MLlib for analyzing streaming data.
- Experience in Spark using Java and Spark SQL for faster testing and data processing.
- Experience working with major components in the Hadoop ecosystem such as MapReduce, HDFS, Hive, Pig, Sqoop, HBase, Flume, Oozie, Kafka, Cloudera, and ZooKeeper.
- Analyzed large data sets of structured, semi-structured and unstructured data using HiveQL, Pig Latin and MapReduce programs.
- Experience in importing and exporting data to and from SQL interfaces using Sqoop.
- Knowledge of job/workflow scheduling and monitoring tools like Oozie and ZooKeeper.
- Experienced in installing and configuring multi-node Hadoop clusters on Amazon EC2.
- Experience with AWS services: EC2, VPC, ELB, S3, SQS, SNS, DynamoDB, Route 53, the AWS CLI, and SDKs.
- Excellent Java development skills using J2EE (JDBC, Servlets, JMS, JNDI, JUnit, Hibernate), Web Services (SOAP/RESTful), and the application servers Apache Tomcat and JBoss; familiar with popular frameworks such as Spring MVC.
- Extensive experience in PL/SQL, developing stored procedures with optimization techniques.
- Experience with NoSQL databases like HBase and Amazon DynamoDB.
- Thorough understanding of object-oriented methodology, UML, and design patterns.
- Good understanding of middleware concepts like connection pooling, transactions, and security.
- Extensively used the build tools Maven, Ant, and Jenkins.
- Experience with developing UNIX shell scripts.
- Strong knowledge of Agile (Scrum) methodology, the Waterfall model, and test-driven development; used JIRA for tracking development work.
- Experience in maintenance projects, with strong analytical and problem-solving skills.
- Extensively worked in business analysis, software design, and development in the Retail and Finance domains.
PROFESSIONAL EXPERIENCE
Big Data / Java Lead Developer
Confidential, Birmingham
RESPONSIBILITIES:
- Developed Spark SQL jobs to load tables into HDFS and run select queries on top of them (see the sketch at the end of this section).
- Used Spark Streaming to divide streaming data into batches as input to the Spark engine for batch processing.
- Loaded data from different sources (MSSQL and DB2) into partitioned Hive tables on HDFS using Sqoop.
- Exported the analyzed data to the relational database using Sqoop for visualization and to generate reports for the BI team.
- Developed multiple MapReduce jobs in Java for data cleaning and preprocessing.
- Configured a multi-node Hadoop cluster on Amazon EC2.
- Wrote Apache Pig scripts to process Locate orders in HDFS.
- Used Oozie for job automation and ZooKeeper for cluster coordination services.
- Worked on the Cloudera distribution and with HDFS.
- Populated HDFS and Cassandra with vast amounts of data using Apache Kafka.
- Developed PL/SQL stored procedures in MSSQL server to retrieve customer data from Relate and Locate order details for sending to the JDA.
- Involved in developing a web application with Java/J2EE for automated deployment with SVN developer kit.
- Worked on provisioning and deploying multi-tenant Hadoop clusters in a public cloud environment (Amazon Web Services) and on private cloud infrastructure using various AWS components.
- Worked with AWS cloud services (VPC, EC2, S3, RDS, Redshift, DynamoDB, Lambda, SNS, and SQS).
- Used Amazon S3 as a storage mechanism and wrote Java programs to load data into S3.
- Implemented Web services for retrieving and updating customer/MVP card details in the Relate and external system Delivra.
- Developed database interface with Hibernate to update and retrieve ‘Ship to Home’ orders from the Locate database.
- Developed Unix Shell scripts for calling stored procedures to retrieve data from MSSQL and calling Java interface to process XML files.
- Used SVN for version control and Mantis for bug tracking.
- Used Maven to automate the compile and build process and wrote unit test cases with JUnit.
Environment: Hadoop 2.x, Spark 2.0, YARN, Pig, Sqoop, Oozie, MapReduce, HDFS, Hive, Java, Eclipse, HBase, Flume, ZooKeeper, Cloudera, Kafka, MSSQL, UNIX Shell Scripting, Maven, Web Services (SOAP, RESTful), SVN, PL/SQL, Linux, JAXB, JUnit, Jenkins, Hibernate, AWS (EC2, S3, RDS, ELB, IAM).
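For illustration, a minimal Java sketch of the kind of Spark SQL loading and querying described above. The input path, view name, and query are hypothetical placeholders, not the actual project code.

```java
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SaveMode;
import org.apache.spark.sql.SparkSession;

public class OrderLoadJob {
    public static void main(String[] args) {
        SparkSession spark = SparkSession.builder()
                .appName("OrderLoadJob")
                .getOrCreate();

        // Assumed input location on HDFS; replace with the real source.
        Dataset<Row> orders = spark.read()
                .option("header", "true")
                .csv("hdfs:///data/incoming/orders/*.csv");
        orders.createOrReplaceTempView("orders");

        // Run a select query on top of the loaded data.
        Dataset<Row> dailyTotals = spark.sql(
                "SELECT order_date, COUNT(*) AS order_count FROM orders GROUP BY order_date");

        // Persist the result back to HDFS for downstream reporting.
        dailyTotals.write().mode(SaveMode.Overwrite)
                .parquet("hdfs:///data/processed/daily_order_counts");

        spark.stop();
    }
}
```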
Big Data / Java Lead Developer
Confidential, San Francisco
RESPONSIBILITIES:
- Involved in loading data from UNIX file system to HDFS.
- Imported and exported data into HDFS and assisted in exporting analyzed data to an RDBMS using Sqoop.
- Wrote various MapReduce programs in Java for data extraction, transformation, and aggregation from multiple file formats, including XML, JSON, CSV, and other compressed formats, and stored the refined data in partitioned tables in the EDW (see the sketch at the end of this section).
- Worked with NoSQL database HBase to create tables and store data.
- Used the Oozie workflow engine to manage interdependent Hadoop jobs and to automate several types of Hadoop jobs such as Java MapReduce, Hive, Pig, and Sqoop.
- Provided maintenance and support for the Java-based ORPOS application and the web applications (SIA, DPP, Shipment, FMR, ORCO, OIS, RIS, and SFS), working on bug fixes and implementing enhancements for these applications.
- Involved in implementing EMV chip-and-PIN integration with the ORPOS system.
- Involved in the migration of all the web applications running on WebSphere to the JBoss application server.
- Extensively used the J2EE server-side components Servlets, JSP, EJB, and JMS for web application customizations.
- Used an Agile (Scrum) based development and maintenance model.
- Extensively used PL/SQL procedures and functions for the FMR web application enhancements.
- Analyzed high CPU utilization issues in the SIA application and tuned the application.
- Involved in development of the RIS application for the Old Navy market and the Ship from Store application for the Athleta market.
- Involved in creating DAOs and used Hibernate for ORM mapping.
- Developed UNIX shell scripts for generating sales and other reports from the ORCO database.
- Developed Web Services using SOAP and RESTful APIs.
- Involved in building and deploying the application in a Linux environment.
- Used SVN for version control, JIRA for tracking Agile stories, and ServiceNow for tracking bugs.
- Coordinated with the offshore team on project deliverables and provided status updates to the client.
Environment: Hadoop 2.x, YARN, Pig, Sqoop, Oozie, MapReduce, HDFS, Hive, Java, Eclipse, HBase, Cloudera, UNIX Shell Scripting, Web Services (SOAP, RESTful), SVN, PL/SQL, JMS, MQ, MySQL, Tomcat, JBoss, JIRA, Servlets, JSP, Toad, Splunk, ServiceNow, Confluence, Spring MVC.
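As a rough illustration of the MapReduce work described above, the sketch below aggregates record counts per category from CSV input. The column position, class names, and paths are assumptions for the example only.

```java
import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class CategoryCount {

    // Mapper: emits (category, 1) for each well-formed CSV line; assumes category is the third column.
    public static class CategoryMapper extends Mapper<LongWritable, Text, Text, LongWritable> {
        private static final LongWritable ONE = new LongWritable(1);
        private final Text category = new Text();

        @Override
        protected void map(LongWritable key, Text value, Context context)
                throws IOException, InterruptedException {
            String[] fields = value.toString().split(",");
            if (fields.length > 2) {
                category.set(fields[2].trim());
                context.write(category, ONE);
            }
        }
    }

    // Reducer: sums the counts for each category.
    public static class SumReducer extends Reducer<Text, LongWritable, Text, LongWritable> {
        @Override
        protected void reduce(Text key, Iterable<LongWritable> values, Context context)
                throws IOException, InterruptedException {
            long sum = 0;
            for (LongWritable v : values) {
                sum += v.get();
            }
            context.write(key, new LongWritable(sum));
        }
    }

    public static void main(String[] args) throws Exception {
        Job job = Job.getInstance(new Configuration(), "category-count");
        job.setJarByClass(CategoryCount.class);
        job.setMapperClass(CategoryMapper.class);
        job.setCombinerClass(SumReducer.class);
        job.setReducerClass(SumReducer.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(LongWritable.class);
        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}
```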
Java Lead Developer
Confidential
RESPONSIBILITIES:
- Used Java and J2EE components for the customization of store web applications and the POS application.
- Involved in customizing the FMR web application for promoting the new brand ‘Athleta’.
- Worked on JavaScript, JSP, and HTML for user interface changes.
- Worked on J2EE components EJB and JMS.
- Developed a Java utility to purge messages from the MQ backout queue and to move unprocessed messages to the process queue (see the sketch at the end of this section).
- Worked on a Java utility enhancement to implement IVR Lotto Calendar file generation in the POS (360 Commerce) for the Puerto Rico region.
- Involved in the Conduit Java utility enhancement, which converts ORPOS transaction information into the legacy POS (360 Commerce) controller log.
- Worked on POS (360 Commerce) dashboard reports that did not match the reports generated by downstream systems.
- Enhanced a UNIX script that inserts store hierarchy details into the central office database.
- Worked in the Linux environment to support store web applications and POS.
- Worked on defect analysis for the store web applications (SIA, DPP, Shipment, and FMR) and the POS (360 Commerce) application, and handled change requests raised against these applications.
- Prepared technical design documents and provided estimations for the change requests.
- Created process/support documents for new joiners and mentored them.
- Led an eight-member team and coordinated with the onsite team on project deliverables.
Environment: POS, Java, EJB, JMS, JSP, Servlets, UNIX, MQ, JavaScript, Dimensions, PL/SQL, MySQL, Toad, Oracle, JBoss, WebSphere, HTML.
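A minimal sketch in the spirit of the MQ backout-queue utility mentioned above, using the standard JMS API. The JNDI names are hypothetical, and the real utility's message-selection and purge rules are not shown.

```java
import javax.jms.Connection;
import javax.jms.ConnectionFactory;
import javax.jms.Message;
import javax.jms.MessageConsumer;
import javax.jms.MessageProducer;
import javax.jms.Queue;
import javax.jms.Session;
import javax.naming.InitialContext;

public class BackoutQueueMover {
    public static void main(String[] args) throws Exception {
        // Hypothetical JNDI names; the actual destinations were configured on the application server.
        InitialContext ctx = new InitialContext();
        ConnectionFactory factory = (ConnectionFactory) ctx.lookup("jms/ConnectionFactory");
        Queue backoutQueue = (Queue) ctx.lookup("jms/BackoutQueue");
        Queue processQueue = (Queue) ctx.lookup("jms/ProcessQueue");

        Connection connection = factory.createConnection();
        try {
            connection.start();
            Session session = connection.createSession(true, Session.SESSION_TRANSACTED);
            MessageConsumer consumer = session.createConsumer(backoutQueue);
            MessageProducer producer = session.createProducer(processQueue);

            // Drain the backout queue, forwarding each unprocessed message to the process queue.
            Message message;
            while ((message = consumer.receive(1000)) != null) {
                producer.send(message);
                // Commit per message so a failure neither loses nor duplicates work.
                session.commit();
            }
        } finally {
            connection.close();
        }
    }
}
```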
Java Lead Developer
Confidential
RESPONSIBILITIES:
- Worked on change requests for customizing the ORPM application.
- Worked on support defect analysis and provided possible fixes.
- Worked on preparing unit test cases and was involved in code reviews.
- Extensively used EJB, Hibernate, and the DAO model (see the sketch at the end of this section).
- Worked on the Swing UI to alter the user interface.
- Extensively worked with the tools Eclipse, SQL Developer, and TextPad.
Environment: ORPM, Java, EJB, Hibernate, PL/SQL, OC4J, ClearCase.
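A generic sketch of the DAO pattern with Hibernate referenced above. The Promotion entity, its mapping, and the SessionFactory wiring are illustrative assumptions rather than the actual ORPM code.

```java
import org.hibernate.Session;
import org.hibernate.SessionFactory;
import org.hibernate.Transaction;

// Assumed mapped entity; the real mapping lived in the project's Hibernate configuration.
class Promotion {
    private Long id;
    private String name;

    public Long getId() { return id; }
    public void setId(Long id) { this.id = id; }
    public String getName() { return name; }
    public void setName(String name) { this.name = name; }
}

// Hypothetical DAO illustrating the EJB/Hibernate/DAO layering.
public class PromotionDao {

    private final SessionFactory sessionFactory;

    public PromotionDao(SessionFactory sessionFactory) {
        this.sessionFactory = sessionFactory;
    }

    // Load a single promotion by its identifier.
    public Promotion findById(Long id) {
        Session session = sessionFactory.openSession();
        try {
            return (Promotion) session.get(Promotion.class, id);
        } finally {
            session.close();
        }
    }

    // Persist changes to a promotion inside a transaction.
    public void save(Promotion promotion) {
        Session session = sessionFactory.openSession();
        Transaction tx = session.beginTransaction();
        try {
            session.saveOrUpdate(promotion);
            tx.commit();
        } catch (RuntimeException e) {
            tx.rollback();
            throw e;
        } finally {
            session.close();
        }
    }
}
```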
Java Developer
Confidential
RESPONSIBILITIES:
- Found ATG server dependencies in the application and replaced them with JEE-compatible components.
- Worked on migrating business components from EJB 1.0 to EJB 3.0.
- Worked on Quartz scheduling mechanisms for scheduling jobs (see the sketch at the end of this section).
- Used JavaScript, HTML, and JSP to alter the user interface.
- Involved in creating technical design documents and in code reviews.
- Created UML diagrams (sequence/use case) using Rational Rose.
- Involved in initial project setup, such as defining packages and workspace setups.
- Identified components tightly coupled with the UI and defined them in the configuration files.
- Worked on the server-side components Servlets and EJB.
- Worked on JavaScript, HTML, and JSP components for defining the user interface.
- Used UML for the creation of sequence and use case diagrams.
- Coordinated with the mainframe teams on input and output data formats.
- Deployed the application on the WebSphere application server and performed testing.
Environment: Java, JEE, JBoss, Rational Rose, EJB, UML, JAXB, Quartz, JPA, Struts, Servlets, JSP, JavaScript, HTML, and WebSphere.
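A brief sketch of Quartz job scheduling along the lines mentioned above, using the Quartz 2.x fluent API. The job class, group names, and cron expression are placeholders.

```java
import org.quartz.CronScheduleBuilder;
import org.quartz.Job;
import org.quartz.JobBuilder;
import org.quartz.JobDetail;
import org.quartz.JobExecutionContext;
import org.quartz.Scheduler;
import org.quartz.Trigger;
import org.quartz.TriggerBuilder;
import org.quartz.impl.StdSchedulerFactory;

public class NightlyBatchScheduler {

    // Hypothetical job body; the real jobs invoked application-specific batch logic.
    public static class NightlyBatchJob implements Job {
        @Override
        public void execute(JobExecutionContext context) {
            System.out.println("Running nightly batch at " + context.getFireTime());
        }
    }

    public static void main(String[] args) throws Exception {
        Scheduler scheduler = StdSchedulerFactory.getDefaultScheduler();

        JobDetail job = JobBuilder.newJob(NightlyBatchJob.class)
                .withIdentity("nightlyBatch", "maintenance")
                .build();

        // Fire every day at 2:00 AM (placeholder cron expression).
        Trigger trigger = TriggerBuilder.newTrigger()
                .withIdentity("nightlyBatchTrigger", "maintenance")
                .withSchedule(CronScheduleBuilder.cronSchedule("0 0 2 * * ?"))
                .build();

        scheduler.scheduleJob(job, trigger);
        scheduler.start();
    }
}
```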
Java Developer
Confidential
RESPONSIBILITIES:
- Created the build script with Maven and deployed application changes to WebSphere.
- Coordinated with the onsite team on clarifications for application customizations.
- Used XML technologies like JAXB to parse XML documents (see the sketch at the end of this section).
- Worked on user interface changes in JSP pages.
- Used the development tools RAD and Toad.
- Worked on PL/SQL functions and procedures.
Environment: Java, J2EE, JSP, Castor 1.0.5, XML, SQL Server 2000, Oracle 10g, Maven, MKS, MQ Series, GSI, JAXB, WebSphere, JMS, UML, Web Services, PL/SQL, Toad.
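A small JAXB sketch along the lines of the XML parsing mentioned above. The Order type, element names, and file name are invented for illustration.

```java
import java.io.File;

import javax.xml.bind.JAXBContext;
import javax.xml.bind.Unmarshaller;
import javax.xml.bind.annotation.XmlElement;
import javax.xml.bind.annotation.XmlRootElement;

public class OrderXmlReader {

    // Minimal assumed document shape: <order><id>123</id><status>SHIPPED</status></order>
    @XmlRootElement(name = "order")
    public static class Order {
        @XmlElement public String id;
        @XmlElement public String status;
    }

    public static void main(String[] args) throws Exception {
        JAXBContext context = JAXBContext.newInstance(Order.class);
        Unmarshaller unmarshaller = context.createUnmarshaller();

        // Parse an XML document into the bound Java object (file name is a placeholder).
        Order order = (Order) unmarshaller.unmarshal(new File("order.xml"));
        System.out.println("Order " + order.id + " has status " + order.status);
    }
}
```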
Java Developer
Confidential
RESPONSIBILITIES:
- Involved in the development of Java and J2EE components.
- Involved in the creation of JSP pages with JavaScript and HTML.
- Used Hibernate for database component mappings.
- Involved in the creation of the server-side components Servlets and EJB.
- Involved in the creation of an interface for interacting with VisionPlus.
- Used Eclipse for development and worked on the Confidential framework X-factor.
Environment: Java, Eclipse, JSP, Servlets, Hibernate, Oracle 10g, CVS, JavaScript, JUnit, EJB, HTML, X-factor, WebSphere.
Java Developer
Confidential
RESPONSIBILITIES:
- Worked on customizations for the eChamps web application.
- Involved in product defect analysis and provided possible resolutions.
- Worked on changes to the server-side components Servlets, EJB, and JMS.
- Used JSP, JSTL, JavaScript, and HTML for the user interface changes.
- Prepared the build script with Maven and deployed to the application server.
- Used WSAD for application development and worked on the MVC framework DS2.
Environment: Java, ES, WSAD, JSP, Servlets, JavaScript, JSTL, MKS, WebSphere, Maven, HTML, DS2.