
Full Stack Architect/Developer Resume


Westport, CT

SUMMARY

  • 9+ years of experience as a developer and business systems analyst across industry domains including robotics, retail, finance, health care, education, and data technology.
  • Experienced in, and with a good understanding of, various industry domains such as power, robotics, and manufacturing.
  • Hadoop ecosystem, including Hive and Pig.
  • Excellent knowledge of Hadoop architecture: HDFS, Job Tracker, Task Tracker, Name Node, Data Node, and the MapReduce programming paradigm.
  • Big Data frameworks and ecosystem, with their evolving technologies.
  • C# .NET, ASP.NET MVC 4, Razor, XML, JavaScript, jQuery, SOAP and REST APIs.
  • Extensive use of Ajax, pub/sub, and closures, among other advanced features of JavaScript.
  • User interface web development in accordance with requirements.
  • Experience with distributed systems, large-scale non-relational data stores, MapReduce systems, data modeling, and big data systems.
  • Knowledge of administrative tasks such as installing Hadoop and its ecosystem components, including Hive and Pig.
  • Knowledge of NoSQL databases such as HBase and MongoDB.
  • In-depth understanding of Hadoop architecture and its components, such as HDFS, JobTracker, TaskTracker, NameNode, DataNode, and MapReduce concepts.
  • Extended Hive and Pig core functionality by writing custom UDFs.
  • Good understanding of data mining and machine learning techniques.
  • Experience in analyzing data using HiveQL, Pig Latin, and custom MapReduce programs in Java, as well as AngularJS, JavaScript, and reusable web components.
  • Installation experience with HBase, Oozie, Hive, Pig, Sqoop, and Flume.
  • Hadoop cluster monitoring using Nagios and Ganglia; troubleshooting OS issues.
  • Installed SafeNet and Tripwire for data encryption and security monitoring of systems.
  • Installed and configured Squid proxy, VNC, NFS, FTP, rsync, DNS, and DHCP servers and clients.
  • Linux kernel upgrades to add tuning capabilities to Linux servers and optimize them.
  • Hardened Linux servers; compiled, built, and installed Apache Server from source with minimal modules.
  • Experience installing RPMs, scripting in Bash, ksh, and Perl, and using awk and sed.
  • Experience with virtualization and VMware administration (ESX/vSphere).
  • Experience in compiling source files and installing open-source software on UNIX and Linux servers.
  • Installed Tripwire to check the integrity of the file system on Linux systems.
  • Task automation through crontabs, Autosys, and Maestro to invoke Java programs and shell scripts.
  • Conducted pre-production meetings for requirements gathering and post-production meetings to review production deployments and remediate mistakes for smoother deployments in the future.
  • Maintained technical documentation of installations, upgrades, and configuration changes, and a run book for production support.
  • Experience with the Apache, Cloudera, and Hortonworks Hadoop distributions.
  • Experienced in building strong websites conforming to W3C standards, using the SOAP communication protocol, valid code, and table-free layouts with HTML, XHTML, and XML.
  • Experience in Agile development environments using Scrum/sprint-based methodologies.
  • Experience with JavaScript technologies, viz. jQuery and AJAX.
  • Extensively supported different types of BI tools, such as digital dashboards, data mining, and data warehousing.
  • Participated in the Quality Assurance (QA) process and maintained the traceability matrix to ensure requirements were consistent, comprehensible, traceable, feasible, and conformant to standards.
  • Participated in bug-review meetings with software developers, QA engineers, and managers; suggested enhancements to the existing application from a business perspective and provided solutions to existing bugs.
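As a minimal sketch of the awk-based log triage and cron-driven automation described above (the log format, with severity in the third whitespace-separated field, and the cron entry are assumptions for illustration):

```shell
# Counts log lines per severity level from a syslog-like file.
# Assumes the severity is the third field, e.g. "Jan 01 ERROR disk full".
count_levels() {
  awk '{ counts[$3]++ } END { for (lvl in counts) print lvl, counts[lvl] }' "$1"
}

# A hypothetical crontab entry that mails a nightly summary:
# 0 2 * * * /opt/scripts/count_levels.sh /var/log/app.log | mail -s "log summary" ops
```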

TECHNICAL SKILLS

Languages: Java, .NET 4.0 and 3.5, C++

Databases: Oracle, SQL Server, Microsoft Access

Web Technologies: PHP, ASP, Perl, Shell, HTML, DHTML, Drupal, Flash Script

IDEs: Microsoft Visual Studio, NetBeans, Eclipse, IntelliJ

Operating Systems: Linux, Windows, and their server counterparts

PROFESSIONAL EXPERIENCE

Confidential, Westport, CT

Full stack Architect/Developer

Environment: HDFS, Hadoop, Java, UNIX, JBoss, Highcharts

Responsibilities:

  • Developed MapReduce programs to parse the raw data, populate staging tables, and store the refined data in partitioned tables in the EDW.
  • Created Hive queries that helped market analysts spot emerging trends by comparing fresh data with EDW reference tables and historical metrics.
  • Enabled speedy reviews and first-mover advantages by using Oozie to automate data loading into the Hadoop Distributed File System and Pig to pre-process the data.
  • Provided design recommendations and thought leadership to sponsors/stakeholders that improved review processes and resolved technical problems.
  • Managed and reviewed Hadoop log files.
  • Tested raw data and executed performance scripts.
  • Shared responsibility for administration of Hadoop, Hive and Pig.
  • Created robust and scalable designs for multiple features that meet business requirements.
  • Used Linux-based DEV and UAT environments.
  • Ensured designs, code, and processes were optimized for performance, scalability, security, reliability, and maintainability.
  • Created appropriate artifacts (white papers, video talks, etc.) to educate the organization on features and underlying technology.
  • Supported development of DRM files for authentication, as security was crucial.
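A sketch of how one such staging-to-EDW partition load might be scripted from a shell wrapper; all database, table, and column names here are hypothetical, and the query-building is factored into a function so it can be checked without a live Hive installation:

```shell
# Builds the HiveQL for loading one day's refined data into a
# partitioned EDW table. Table and column names are hypothetical.
build_load_sql() {
  local dt="$1"
  printf "INSERT OVERWRITE TABLE edw.refined_events PARTITION (load_date='%s') SELECT * FROM staging.raw_events WHERE event_date='%s';" "$dt" "$dt"
}

# Typical use from a wrapper script on a node with the Hive client:
# hive -e "$(build_load_sql 2014-06-01)"
```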

Confidential - Bridgeport, CT

Full stack Architect/Developer

Environment: HDFS, Hadoop, Java, UNIX, Highcharts, Spring, Hibernate, HTML5, PHP, SQL, HQL.

Responsibilities:

  • Developed a platform independent deployable product.
  • Created robust and scalable designs for multiple features that meet business requirements.
  • Built wrapper shell scripts to drive the Oozie workflows.
  • Developed Pig Latin scripts to transform the log data files and load them into HDFS.
  • Used Pig as an ETL tool for transformations, event joins, and some pre-aggregations before storing the data in HDFS.
  • Hands-on experience with NoSQL databases like Cassandra, used in a POC (proof of concept) for storing URLs and images.
  • Developed Hive UDFs for functions not pre-existing in Hive, such as rank.
  • Created external Hive tables and was involved in data loading and writing Hive UDFs.
  • Implemented POCs to migrate iterative MapReduce programs into Spark transformations using Scala.
  • Enabled concurrent access to Hive tables with shared and exclusive locking, enabled in Hive with the help of a ZooKeeper implementation in the cluster.
  • Wrote shell scripts to monitor the health of Hadoop daemon services and respond to any warning or failure conditions.
  • Developed unit test cases using MRUnit for MapReduce code.
  • Involved in creating Hadoop streaming jobs.
  • Involved in installing and configuring the Hadoop ecosystem and Cloudera Manager using the CDH4 distribution.
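A daemon health check like the one mentioned above might be sketched as follows; the core check is factored into a function over a `jps`-style listing so it can be exercised without a live cluster, and the alerting hook in the comment is hypothetical. The daemon names follow the Hadoop 1.x set listed in the summary.

```shell
# Sketch of a Hadoop daemon health check. Expects the output of `jps`
# (one "pid Name" per line) and reports any expected daemon that is
# missing from the listing.
check_daemons() {
  local listing="$1" missing=""
  for daemon in NameNode DataNode JobTracker TaskTracker; do
    printf '%s\n' "$listing" | grep -qw "$daemon" || missing="$missing $daemon"
  done
  if [ -n "$missing" ]; then
    echo "WARN: missing daemons:$missing"
    return 1
  fi
  echo "OK: all daemons running"
}

# Cron-driven use on a live node might look like (hypothetical address):
# check_daemons "$(jps)" || mail -s "Hadoop daemon down" ops@example.com
```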

Confidential, PA

Sr. Data Analyst

Environment: HDFS, Hadoop, Java, UNIX, Highcharts, Hippo CMS, OpenDJ, MongoDB, and Groovy.

Responsibilities:

  • Developed data pipeline using Flume to ingest customer behavioral data and financial histories into HDFS for analysis.
  • Prepared best practices for writing MapReduce programs and Hive scripts.
  • Scheduled an Oozie workflow to import the revenue department's weekly transactions from an RDBMS.
  • Built wrapper shell scripts to hold these Oozie workflows.
  • Developed Pig Latin scripts to transform the log data files and load them into HDFS.
  • Used Pig as an ETL tool for transformations, event joins, and some pre-aggregations before storing the data in HDFS.
  • Developed Hive UDFs for functions not pre-existing in Hive, such as rank.
  • Created external Hive tables and was involved in data loading and writing Hive UDFs.
  • Implemented POCs to migrate iterative MapReduce programs into Spark transformations using Scala.
  • Enabled concurrent access to Hive tables with shared and exclusive locking, enabled in Hive with the help of a ZooKeeper implementation in the cluster.
  • Wrote shell scripts to monitor the health of Hadoop daemon services and respond to any warning or failure conditions.
  • Developed unit test cases using MRUnit for MapReduce code.
  • Involved in creating Hadoop streaming jobs.
  • Involved in installing and configuring the Hadoop ecosystem and Cloudera Manager using the CDH4 distribution.
  • Developed reports using SQL Server Reporting Services (SSRS).
  • Made extensive use of AJAX and the AJAX Toolkit to improve performance of the application as a whole.
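A wrapper for the weekly RDBMS import described above might build its Sqoop invocation like this; the connection string, table, and target directory are all hypothetical, and the command is returned as a string so it can be inspected without a database:

```shell
# Builds a weekly Sqoop import command for one week's transactions.
# Connection URL, table, and target directory are hypothetical.
build_sqoop_import() {
  local week_start="$1"
  printf "sqoop import --connect jdbc:mysql://dbhost/revenue --table transactions --where \"txn_date >= '%s'\" --target-dir /data/revenue/week=%s -m 4" \
    "$week_start" "$week_start"
}

# An Oozie shell action or cron entry would then execute:
# eval "$(build_sqoop_import 2013-07-01)"
```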

Confidential, Bridgeport, CT

Senior Researcher/ Developer

Environment: HDFS, Hadoop, Java, UNIX, Linux, C#, AForge, Cygwin.

Responsibilities:

  • Worked with students to develop and deploy fully scalable projects.
  • Worked with students through the development stages as well as the deployment stages for final product and project deployments.
  • Designed a web-based system for educational research analytics and data visualization in the Hadoop ecosystem; integrated Tableau on the Hadoop framework to visualize and analyze data, with Hive used for data delivery.
  • Worked on several Apache Hadoop projects. MapReduce programs were developed using the Hadoop Java API as well as Hive and Pig.
  • Designed and implemented solutions for Hadoop AFMS system validation covering all components of the data analytics stack.
  • Worked with Sqoop to import/export data between relational databases and Hadoop, and with Flume to collect data and populate Hadoop.
  • Implemented and integrated a Hadoop-based business intelligence and data warehouse system, including implementations of searching, filtering, indexing, and aggregation for reporting, report generation, and general information retrieval from data sets for research purposes.
  • Maintained Hadoop clusters for dev/staging/production. Trained the development, administration, testing, and analysis teams on the Hadoop framework and ecosystem.
  • Gave extensive presentations about the Hadoop ecosystem, best practices, and data architecture in Hadoop.
  • Developed wireless sensor labs and programming modules for testing and developing new and efficient wireless algorithms.
  • Worked on data mining algorithms for images provided by NASA (currently implemented in the Discovery Museum project).
  • Published 5 journal and 11 conference papers related to my research; awarded best research paper at ASEE and several other conferences.

Confidential

Lead Java and UNIX full stack developer

Environment: JAVA, JDBC, HTML

Responsibilities:

  • Implemented a first-of-its-kind cloud-based portal system in India.
  • Data sets and analytics were handled on UNIX servers running scripts.
  • Data segmentation was done on remote computers holding client information, using a simple servlet-based programming model for information exchange.
  • Learned some CSR terminology.
  • Involved in design of the application using UML, and in coding, testing, and implementation of the application.
  • Designed and developed process flow diagrams for both greenfield and brownfield projects.
  • Developed various index methods to help calculate activity progress.
  • Involved in developing the SRS (system requirements specification) for the product.
  • Managed the marketing and sales of our product.
  • Developed documentation and manuals.
  • Developed custom programs and robotic kits.
  • Custom-designed PCBs.
  • Conducted workshops and classes.
