
Sr Software Engineer Resume


Denver, CO

SUMMARY

  • 8+ years of total experience in software development using Big Data, Hadoop, Spark, and Java/J2EE.
  • Excellent understanding of Hadoop architecture and daemons such as HDFS, NameNode, DataNode, JobTracker, and TaskTracker.
  • 3 years of experience in Big Data technologies such as Hadoop frameworks, MapReduce, Hive, HBase, Pig, Sqoop, Spark, Kafka, Flume, ZooKeeper, Oozie, and Storm.
  • Strong functional experience with the Cloudera Data Platform using VMware Player in a CentOS 6 Linux environment, as well as Hadoop distributions such as Cloudera, MapR, and Hortonworks.
  • Adept at writing complex MapReduce programs that work with different file formats such as Text, Sequence, XML, and JSON.
  • Expertise in database design, creation and management of schemas, and writing stored procedures, functions, DDL/DML statements, and complex SQL queries for Oracle.
  • Sound knowledge of the NoSQL database HBase.
  • Installation, configuration, management, support, and monitoring of Hadoop clusters, including components such as Apache Spark.
  • Used HBase to load and retrieve data for real-time processing through its REST API.
  • Used Sqoop to import data from RDBMS into HDFS/Hive and to export data from HDFS/Hive back to RDBMS.
  • Extended Hive and Pig core functionality with custom User Defined Functions (UDF), User Defined Table-Generating Functions (UDTF), and User Defined Aggregate Functions (UDAF); a minimal UDF sketch follows this summary.
  • Used BI tools such as Tableau for report creation and further analysis from the front end.
  • Proficient in design patterns such as Spring MVC, Data Transfer Object, Value Object, Singleton, Service Locator, Session Façade, Factory, and DAO.
  • Extensively used the DAO pattern, including mapping DAO objects, configuration files, and classes to interact with the database using Hibernate.
  • Involved in all Software Development Life Cycle (SDLC) phases, including analysis, design, implementation, testing, and maintenance.
  • Extensive use of web/application servers such as Apache Tomcat, JBoss, WebLogic, and IBM WebSphere.
  • Used Log4j for logging information and exceptions, Apache Ant for build scripts, VSS and ClearCase for version control, and Eclipse as the IDE.
  • Involved in business analysis and technical design sessions with business and technical staff to develop requirements document and ETL design specifications.
  • Well versed in software development methodologies such as Agile and Scrum.
  • Experience in production, quality assurance (QA), system integration testing, and user acceptance testing.
  • Experience in client-side design and validation using Bootstrap, JavaScript, and AJAX.
  • Experience with markup languages such as HTML and front-end frameworks such as AngularJS.
  • Worked with XML binding/parsing frameworks such as JAXB.
  • Worked with version control systems such as Subversion and Git to provide a common platform for all developers, and with bug tracking tools such as JIRA.
  • Experience with database development using database engines like Oracle, MySQL.
  • Expertise in using Application Servers like Web Logic and Apache Tomcat.
  • Articulate in written and verbal communication, with strong interpersonal, analytical, and organizational skills.
  • Highly motivated Team player with the ability to work independently and adapt quickly to new and emerging technologies.
  • Good at communicating and presenting models to business customers.
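
As an illustration of the Hive UDF work mentioned above, here is a minimal sketch of a custom UDF written against the classic org.apache.hadoop.hive.ql.exec.UDF API; the package, class name, and registration commands are illustrative, not taken from an actual project.

```java
package com.example.hive.udf;   // illustrative package name

import org.apache.hadoop.hive.ql.exec.UDF;
import org.apache.hadoop.io.Text;

/**
 * Minimal Hive UDF sketch: trims and lower-cases a string column.
 * Registered in Hive with (paths and names are illustrative):
 *   ADD JAR /path/to/udfs.jar;
 *   CREATE TEMPORARY FUNCTION normalize_str AS 'com.example.hive.udf.NormalizeString';
 */
public class NormalizeString extends UDF {
    public Text evaluate(Text input) {
        if (input == null) {
            return null;   // pass NULLs through unchanged
        }
        return new Text(input.toString().trim().toLowerCase());
    }
}
```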

TECHNICAL SKILLS

Big Data: HDFS, MapReduce, Hive, Pig, Sqoop, Flume, Oozie, Hadoop Streaming, Storm, Spark, Kafka, YARN, ZooKeeper, HBase

Languages: Java, Python, C, Scala, SQL, PL/SQL, Shell Script, Web Services, Servlets, JavaScript, JDBC

Frameworks: Spring, Hibernate, JMS

Web Technologies: AngularJS, jQuery UI, Ajax, HTML5, CSS3, RESTful Services, JavaScript, Bootstrap, JSON, XML, SOAP Web Services, JAXB

Databases: Oracle 10g, 11g

Web Servers: Apache Tomcat, WebLogic, IBM WebSphere, JBoss

Build Tools: Jenkins, ANT

Reporting Tools: Jasper Reports, Crystal Reports, Tableau

PROFESSIONAL EXPERIENCE

Confidential, Denver, CO

Sr Software Engineer

Responsibilities:

  • Worked autonomously within a team of data analysts to analyze, review, update, edit, clean, translate, and ensure the accuracy of customer data.
  • Responsible for building scalable distributed data solutions using Hadoop.
  • Wrote MapReduce jobs to parse the web logs stored in HDFS (a minimal mapper sketch follows this list).
  • Imported and exported data between RDBMS and HDFS/Hive using Sqoop, for downstream processing with Pig.
  • Managed data coming from different sources and monitored the running MapReduce programs on the cluster
  • Loaded data from UNIX file systems into HDFS; installed and configured Hive and wrote Pig/Hive UDFs.
  • Created Hive tables and wrote Hive queries that invoke MapReduce jobs in the backend.
  • Implemented the workflows using Apache Oozie framework to automate tasks.
  • Worked with NoSQL databases such as HBase, creating HBase tables to load large sets of semi-structured data coming from various sources.
  • Loaded data into HBase using both bulk and non-bulk loads.
  • Developed scripts to automate end-to-end data management and synchronization between all the clusters.
  • Explored Spark to improve the performance and optimization of the existing algorithms in Hadoop.
  • Imported data from different sources such as HDFS and HBase into Spark RDDs.
  • Installed and configured Hive, Pig, Sqoop, Flume, and Oozie on the Hadoop cluster.
  • Involved in converting Hive/SQL queries into Spark transformations using Spark RDDs, Scala, and Python.
  • Followed Agile methodology for the entire project; prepared technical design documents and detailed design documents.
  • Experienced in managing and reviewing Hadoop log files.
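
A minimal sketch of the kind of MapReduce job described above for parsing web logs stored in HDFS; the space-delimited log layout, field index, and class name are illustrative assumptions.

```java
import java.io.IOException;

import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;

// Minimal web-log parsing mapper: emits (HTTP status code, 1) for each log line.
// The common-log-format layout and field index are assumptions for illustration.
public class WebLogMapper extends Mapper<LongWritable, Text, Text, IntWritable> {

    private static final IntWritable ONE = new IntWritable(1);
    private final Text statusCode = new Text();

    @Override
    protected void map(LongWritable key, Text value, Context context)
            throws IOException, InterruptedException {
        String[] fields = value.toString().split(" ");
        if (fields.length > 8) {            // skip malformed lines
            statusCode.set(fields[8]);      // status-code position (assumed)
            context.write(statusCode, ONE);
        }
    }
}
```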

Environment: Hive, HBase, Flume, AngularJS, Pig, Spark, Oozie, Oracle, YARN, GitHub, JUnit, Tableau, UNIX, Sqoop, HDFS, Tomcat, Java, Scala, Python

Confidential

Sr Software Engineer

Responsibilities:

  • Involved with the application teams to install Hadoop updates, patches, and version upgrades as required.
  • Developed simple to complex MapReduce jobs using Hive and Pig.
  • Worked on analyzing and writing Hadoop MapReduce jobs using the Java API, Pig, and Hive.
  • Created HBase tables to store data in variable formats coming from different portfolios (see the HBase client sketch after this list).
  • Involved in configuring core-site.xml and mapred-site.xml for the multi-node cluster environment.
  • Worked on Hadoop ecosystem components HDFS, MapReduce, Hive, Pig, Sqoop, and HBase.
  • Used Flume to collect, aggregate, and store web log data from different sources such as web servers and mobile and network devices, and pushed it to HDFS.
  • Wrote Hive queries to analyze terabytes of customer data from HBase and wrote the results to an output file.
  • Developed simple and complex MapReduce programs in Java for Data Analysis on different data formats
  • Experience in using Flume to efficiently collect, aggregate and move large amounts of log data.
  • Was responsible for importing the data (mostly log files) from various sources into HDFS using Flume
  • Created Oozie workflows to run multiple Hive jobs.
  • Extensively used Sqoop to import/export data between RDBMS and Hive tables, performed incremental imports, and created Sqoop jobs.
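
A minimal sketch of loading portfolio records into an HBase table through the Java client (shown with the HBase 1.x-style client API); the table name, column family, qualifiers, and row-key format are hypothetical.

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.Connection;
import org.apache.hadoop.hbase.client.ConnectionFactory;
import org.apache.hadoop.hbase.client.Put;
import org.apache.hadoop.hbase.client.Table;
import org.apache.hadoop.hbase.util.Bytes;

// Minimal sketch: store one portfolio record in HBase.
// Table name, column family, qualifiers, and row key are hypothetical.
public class PortfolioWriter {
    public static void main(String[] args) throws Exception {
        Configuration conf = HBaseConfiguration.create();
        try (Connection connection = ConnectionFactory.createConnection(conf);
             Table table = connection.getTable(TableName.valueOf("portfolio_data"))) {

            Put put = new Put(Bytes.toBytes("portfolio1#2015-01-01"));   // row key: id + date
            put.addColumn(Bytes.toBytes("cf"), Bytes.toBytes("format"), Bytes.toBytes("json"));
            put.addColumn(Bytes.toBytes("cf"), Bytes.toBytes("payload"), Bytes.toBytes("{\"amount\": 100}"));
            table.put(put);
        }
    }
}
```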

Environment: Hadoop, Cloudera (CDH 4), HDFS, Hive, HBase, Flume, Sqoop, Pig, Kafka, Java, Eclipse, Tableau, Talend, Ubuntu, UNIX, and Maven.

Confidential

Sr. Software Engineer

Responsibilities:

  • Involved in the design and development of multiple modules.
  • Used the Spring framework as the middle-tier component and integrated it with Hibernate for back-end development.
  • Responsible for the design and development of J2EE based application.
  • Developed the UI using HTML5, JavaScript, jQuery, JSP, and Ajax.
  • Developed the web tier components using HTML, CSS, JSP and JavaScript.
  • Extensively participated in application integration testing.
  • Involved in coding and in daily builds and deployments to the server.
  • Introduced a new mechanism for reading data from a web service.
  • Prepared Detailed design documents.
  • Used J2EE and JSP for server-side front-end development.
  • Used RESTful and SOAP web services to integrate with other products.
  • Used JavaMail for sending alerts and notifications.
  • Used JMS for data transmission.
  • Developed SQL queries and triggers for the scheduling mechanism.
  • Used Apache POI to generate Excel downloads of logger information (see the POI sketch after this list).
  • Used JBoss, WebLogic, OC4J, and IBM WebSphere as application servers.
  • Used Oracle as the database; involved in developing the PL/SQL backend implementation and wrote SQL SELECT, UPDATE, and DELETE statements.
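
A minimal sketch of the Apache POI usage mentioned above for exporting logger information to an Excel workbook; the log entries, columns, and file name are illustrative.

```java
import java.io.FileOutputStream;
import java.util.Arrays;
import java.util.List;

import org.apache.poi.ss.usermodel.Row;
import org.apache.poi.ss.usermodel.Sheet;
import org.apache.poi.ss.usermodel.Workbook;
import org.apache.poi.xssf.usermodel.XSSFWorkbook;

// Minimal sketch: dump logger entries to an Excel workbook with Apache POI.
// The log entries and output file name are illustrative placeholders.
public class LogExcelExporter {
    public static void main(String[] args) throws Exception {
        List<String[]> logEntries = Arrays.asList(
                new String[] {"2015-03-01 10:00:00", "INFO", "Job started"},
                new String[] {"2015-03-01 10:05:12", "ERROR", "Connection timed out"});

        try (Workbook workbook = new XSSFWorkbook();
             FileOutputStream out = new FileOutputStream("logger-info.xlsx")) {
            Sheet sheet = workbook.createSheet("Logs");
            int rowNum = 0;
            for (String[] entry : logEntries) {
                Row row = sheet.createRow(rowNum++);
                for (int col = 0; col < entry.length; col++) {
                    row.createCell(col).setCellValue(entry[col]);
                }
            }
            workbook.write(out);   // flush the workbook to disk
        }
    }
}
```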

Environment: Java, J2EE, Web Services, Log4j, Eclipse IDE, WebLogic 10, Ant, SQL Developer, Oracle.

Confidential

Software Engineer

Responsibilities:

  • Involved in the requirements analysis, use case development and design of the application
  • Created the Entities for the database objects, Custom Entities and tables using Entity Framework.
  • Developed web forms and Windows forms for the relevant applications using AngularJS.
  • Built web applications with Java Spring MVC, JavaScript, jQuery, JSON, HTML, XHTML, and CSS.
  • Developed prototypes using HTML, Bootstrap, and CSS and implemented feedback from business users.
  • Migrated the SQL Server database to Windows Azure SQL Database and updated the connection strings accordingly.
  • Built a single-page application using AngularJS, JavaScript, and jQuery with Bootstrap styling.
  • Designed the application using an MVC (Model View Control) Architecture that promotes clear separation of the presentation, business logic and data access tiers.
  • Used GIT for source code management.
  • Used Cascading Style Sheets (CSS) and Bootstrap to apply styles to List View and some other tables in dashboards.
  • Used Kendo UI, Bootstrap, and jQuery libraries for more productive web design.
  • Created databases and schema objects including tables and indexes, applied constraints, connected various applications to the database, and wrote functions, stored procedures, and triggers using SQL Server.
  • Created RESTful web services to communicate with the backend (see the controller sketch after this list).
  • Involved in performance optimization and tuning.
  • Developed and maintained ETL (data extraction, transformation, and loading) processes to extract data from multiple source systems comprising databases such as Oracle 10g.
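
A minimal sketch of a Spring MVC RESTful endpoint of the kind described above, returning JSON consumed by the AngularJS front end; the Customer type, URL path, and sample data are hypothetical, and Jackson is assumed on the classpath for JSON rendering.

```java
import java.util.Arrays;
import java.util.List;

import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RequestMethod;
import org.springframework.web.bind.annotation.RestController;

// Minimal sketch of a RESTful endpoint consumed by the AngularJS front end.
// The Customer type and hard-coded data are illustrative placeholders.
@RestController
@RequestMapping("/api/customers")
public class CustomerController {

    public static class Customer {
        public long id;
        public String name;

        public Customer(long id, String name) {
            this.id = id;
            this.name = name;
        }
    }

    // GET /api/customers -> JSON array rendered by Spring's HTTP message converters
    @RequestMapping(method = RequestMethod.GET)
    public List<Customer> list() {
        return Arrays.asList(new Customer(1, "Acme Corp"), new Customer(2, "Globex"));
    }
}
```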

Environment: Java, J2EE, Spring, Hibernate, JSP, Eclipse IDE, WebLogic 10, SQL Developer, Oracle

Confidential

Software Engineer

Responsibilities:

  • Developed various product applications using Java, J2EE, and related technologies. Prepared detailed design documents.
  • Involved in development of middle layer business methods, which incorporated the core business functionality using Singleton Beans.
  • Involved in various phases of Software Development Life cycle (SDLC) of the application like requirement gathering, Design, Analysis and code development.
  • Developed user-management screens using AngularJS, business components using the Spring framework, and DAO classes using the JPA framework for persistence management, and integrated these frameworks for the project (see the DAO sketch after this list).
  • Used J2EE and JSP for server-side front-end development.
  • Provided greater control over the working time of the on-street fleet.
  • Enabled better planning, scheduling, and prioritization of activities performed by the collection executives.
  • Made detailed information about field executives' activities available to help Magma with various analyses and planning for efficiency improvements.
  • Involved in the development phase of FI and tele-caller customer-details maintenance.
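
A minimal sketch of a Spring-managed, JPA-based DAO of the kind described above; the User entity, table name, and query are hypothetical placeholders included only to keep the sketch self-contained.

```java
import java.util.List;

import javax.persistence.Entity;
import javax.persistence.EntityManager;
import javax.persistence.GeneratedValue;
import javax.persistence.Id;
import javax.persistence.PersistenceContext;
import javax.persistence.Table;

import org.springframework.stereotype.Repository;
import org.springframework.transaction.annotation.Transactional;

// Hypothetical entity used only to make the DAO sketch self-contained.
@Entity
@Table(name = "APP_USER")
class User {
    @Id
    @GeneratedValue
    private Long id;

    private String name;

    public Long getId() { return id; }
    public String getName() { return name; }
    public void setName(String name) { this.name = name; }
}

// Minimal sketch of a Spring-managed DAO that uses JPA for persistence.
@Repository
public class UserDao {

    @PersistenceContext
    private EntityManager entityManager;

    @Transactional
    public void save(User user) {
        entityManager.persist(user);
    }

    public User findById(Long id) {
        return entityManager.find(User.class, id);
    }

    public List<User> findAll() {
        return entityManager.createQuery("SELECT u FROM User u", User.class).getResultList();
    }
}
```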

Environment: Java, J2EE, Spring, Hibernate, JSP, Eclipse IDE, WebLogic 10, SQL Developer, Oracle
