
Sr. Architect Resume


Manhattan, NY

SUMMARY:

  • A senior IT professional with over 12 years of experience in the analysis, design, development and implementation of large-scale web-based applications using Core Java, J2EE, Hadoop, MapReduce, Spark, Storm, Kafka, Phoenix, HBase, Redis, AWS and related technologies
  • Confidential Certified Cloud Computing Solution Architect on Amazon Cloud (AWS)
  • Hands-on architect; has led design and development work on projects, with a focus on mentoring teams technically while working on application development
  • Extensive work on NoSQL technologies, particularly DataStax Enterprise (Cassandra) and HBase
  • Implemented an IoT use case for large-scale streaming of sensor data. Familiar with handling time-series data on Cassandra as well as HBase and OpenTSDB
  • Thorough understanding of Data Warehouse and ETL concepts
  • Experience in Hadoop, Storm, Pig and Linux shell scripting
  • Extensive work with Confidential WebSphere application servers. Extensive programming experience in Core Java and J2EE using WebSphere Studio Application Developer (WSAD 5.x) and related application developer tools
  • Experience in writing, diagnosing and performance-tuning queries and stored procedures in MySQL and DB2. Experience working with Oracle 10.x databases, including SQL (DML, DDL queries)
  • Extensive experience with CVS, ClearCase and SVN for source control
  • Extensive experience in working with Maven, Jenkins
  • Extensive experience developing programs in MapReduce, Spark and Scala
  • Good knowledge of Visualization tools, particularly Tableau and Kibana
  • Strong understanding of security on Hadoop across both CDH and HDP implementations
  • Extensive experience designing and implementing Web Applications
  • Extensive experience working closely with business teams onsite and coordinating with offshore teams
  • Experience in implementing projects using Waterfall, RUP, Agile Methodologies and exposure to SCRUM project implementation methodology
  • Experience in leading and managing teams, particularly go-live and Production Support teams
  • Vast experience working on all phases of System Development Life Cycle (SDLC) including, but not limited to Design, Development, Testing, Implementation, Rollout and Production Support
  • Excellent analytical ability, Project Management and Presentation skills
  • Good leadership qualities, excellent communication and interpersonal skills
  • Quick learner of business processes; industry experience across the Financial Services, Energy, Public Sector and Healthcare verticals
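The time-series work mentioned above (Cassandra, HBase, OpenTSDB) typically hinges on row-key design. A minimal sketch of a salted composite key, assuming a fixed bucket count; all names and the reversed-timestamp layout are illustrative, not taken from a specific project:

```java
import java.nio.ByteBuffer;
import java.nio.charset.StandardCharsets;

public class TimeSeriesRowKey {
    // Illustrative bucket count; spreads hot sensors across regions.
    static final int SALT_BUCKETS = 16;

    // Composite key: [salt byte][sensor id][reversed timestamp].
    // Reversing the timestamp makes the newest readings sort first.
    static byte[] rowKey(String sensorId, long epochMillis) {
        byte salt = (byte) Math.floorMod(sensorId.hashCode(), SALT_BUCKETS);
        byte[] id = sensorId.getBytes(StandardCharsets.UTF_8);
        ByteBuffer buf = ByteBuffer.allocate(1 + id.length + Long.BYTES);
        buf.put(salt);
        buf.put(id);
        buf.putLong(Long.MAX_VALUE - epochMillis);
        return buf.array();
    }
}
```

With this layout, two readings from the same sensor share a salt bucket, and the later reading sorts lexicographically before the earlier one.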

TECHNICAL SKILLS:

Programming Languages: Java (J2EE), SQL (MySQL), HTML, JavaScript, UNIX Shell Scripting.

Big Data Technologies: Hadoop 1.0.1, MapReduce, Pig, HBase 0.94.8, Cassandra 2.0.2, Sqoop, twitter4j, Storm 0.9.1, Spark, Scala, Nutch, Solr, AWS

Application Servers: WebSphere 5.x.

Content Management: Documentum 5.3, Confidential, Confidential 6.1

Source Control / Build Tools: CVS, JIRA, ClearQuest, ClearCase, SVN, Apache Maven 2.2.1.

J2EE / Frameworks: Spring MVC, Struts, Log4j

IDE Tools: Eclipse, WebSphere Application Developer (WSAD - 5.x), Rational Application Developer (RAD)

Databases: MySQL 5.x, Oracle, DB2.

PROFESSIONAL EXPERIENCE:

Confidential, Manhattan, NY

Sr. Architect

Environment: Java, Cloudera Hadoop 5.4 (Yarn), Map-Reduce, HBase, Hive, Linux, Agile-Scrum, SVN, Maven, Impala, Storm, Kafka, Tableau, Jira, Oozie, Spark

Responsibilities:

  • Primary responsibility for implementing streamlined DevOps processes
  • Helped teams transition legacy MapReduce code to Spark
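A typical MapReduce-to-Spark port replaces the mapper and reducer classes with a short chain of RDD transformations. A minimal Java sketch of the pattern, assuming spark-core 2.x on the classpath; the word-count job, class name and argument paths are illustrative, not from the actual project:

```java
import java.util.Arrays;

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaPairRDD;
import org.apache.spark.api.java.JavaSparkContext;

import scala.Tuple2;

public class WordCountSpark {
    public static void main(String[] args) {
        SparkConf conf = new SparkConf().setAppName("wordcount");
        try (JavaSparkContext sc = new JavaSparkContext(conf)) {
            JavaPairRDD<String, Integer> counts = sc.textFile(args[0])
                // flatMap + mapToPair replace the old Mapper class
                .flatMap(line -> Arrays.asList(line.split("\\s+")).iterator())
                .mapToPair(word -> new Tuple2<>(word, 1))
                // reduceByKey replaces the Reducer
                .reduceByKey(Integer::sum);
            counts.saveAsTextFile(args[1]);
        }
    }
}
```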

Confidential, Houston, TX

Solution Architect

Environment: Java, Hadoop (Yarn) Hortonworks HDP 2.3, Map-Reduce, Spark and Storm streaming, Jenkins

Responsibilities:

  • Took over as lead for Phase 1 from another resource who rolled onto a different project at Confidential
  • Led the team through UAT and production rollout. Resolved issues quickly to satisfy exit criteria
  • Designed new feature requests for improvements to the Phase 1 implementation using Storm, Spark, Kafka and Cassandra
  • Built data processing flows on Spark with Scala
  • Tasked with exploring newer processing frameworks including Hortonworks Dataflow (based on Apache Nifi) and Apache Samza
  • Implemented DataStax Enterprise
  • Extensive work on Cassandra data modeling and column family design
  • Built code using the DataStax Java Driver; ported older code from Thrift API clients
  • Used the Spark-Cassandra Connector to load data to and from Cassandra
  • Monitored the Cassandra cluster for resource utilization.
  • Knowledge of Cassandra systems backup and restore processes, incremental backup, snapshots (SSTables)
  • Knowledge of Cassandra security
  • Knowledge of Cassandra maintenance and tuning - both database and server
  • Understood the existing codebase for loads into Cassandra and targeted performance improvements
  • Planning and buildout of new development environment for phase 2. Created project charter with initial timelines, planned data sources and volumes, streaming requirements, key decision factors and risks, security requirements
  • Dev Cluster installation of Hortonworks
  • Security using Kerberos and ACL, Ranger, Knox and AD integration
  • Ambari for operations management
  • Initial data ingestion using Sqoop and real-time pipeline using Kafka, Storm, HBase
  • Built well reference data and sensor data models for handling large volumes of sensor data. Created a design to replicate the process in Hadoop
  • Extensive work on Jenkins for Continuous Integration
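The Cassandra time-series modeling described above usually centers on choosing partition and clustering keys so that a partition stays bounded while recent readings remain cheap to read. A hedged CQL sketch; the keyspace, table and column names are assumptions:

```sql
-- Partition by sensor and day so a single partition stays bounded;
-- cluster by event time descending so recent readings read first.
CREATE TABLE sensor_data.readings (
    sensor_id  text,
    day        date,
    event_time timestamp,
    value      double,
    PRIMARY KEY ((sensor_id, day), event_time)
) WITH CLUSTERING ORDER BY (event_time DESC);
```

With this model, "latest N readings for a sensor today" is a single-partition slice query, which is the access pattern streaming sensor workloads usually optimize for.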

Confidential, Manhattan, NY

Sr. Developer/Architect

Environment: Java 1.7, Hadoop Hortonworks HDP 2.2.0 (Yarn), Map-Reduce, HBase-0.98.0.1, Hive, Linux Centos 5.x, Agile-Scrum, SVN, Maven, Apache Phoenix, Storm, Kafka, Tableau, SAP Business Objects, Jira, Squirrel, Ambari, Falcon, Oozie, Spark 1.4.1

Responsibilities:

  • Architected solutions, worked directly with business partners discussing the requirements for applications. Designed and implemented the solution for Settlements and entitlements management on Hadoop
  • Analyzed the requirements of the existing program in Core Java
  • Designed the SEMS to include different data sources for snapshots and deltas using Storm
  • Used sequence diagrams to explain the flow of data in the Big Data system
  • Wrote producer and consumer programs to send data through a message queue using Kafka
  • Built the initial POC using a Storm topology with bolts and spouts
  • Dev Cluster installation of Hortonworks
  • Security using Kerberos
  • Ambari for operations management
  • Initial data ingestion using Sqoop and real-time pipeline using Kafka and Storm
  • Wrote Hive queries to query the data in HBase.
  • Wrote Java program to bulk upload data in HBase
  • Wrote extensive shell scripts for automation
  • Designed Queries in Phoenix using indexes for faster retrieval of Data from HBase.
  • Wrote multiple queries in Phoenix to pull data from HBase
  • Managed the project based on Agile-Scrum Methods
  • Worked on and mentored team using technologies like Hadoop, HBase, Storm, Kafka, Phoenix
  • Built a POC for getting data from Phoenix using Spark, and for loading flat-file dumps into Spark
  • Queried via SQLContext to return results matching the customer's SQL query
  • Implemented Social Listening platform using Spark Streaming as well as ELK stack (ElasticSearch, LogStash and Kibana)
  • Implemented PoC for large-scale time series data using Kafka, Storm, Hbase, OpenTSDB and Tableau
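The Phoenix index design mentioned above can be illustrated with a covered secondary index, which lets Phoenix answer the query from the index alone without touching the base HBase table. The table and column names here are hypothetical, not the actual settlements schema:

```sql
-- Phoenix table over HBase; the covered index serves the query below
-- without a read of the base table.
CREATE TABLE settlements (
    trade_id     VARCHAR NOT NULL PRIMARY KEY,
    account_id   VARCHAR,
    settle_date  DATE,
    amount       DECIMAL(18, 2)
);

CREATE INDEX idx_settle_by_account
    ON settlements (account_id, settle_date)
    INCLUDE (amount);

SELECT account_id, settle_date, amount
FROM settlements
WHERE account_id = 'ACC-1001'
  AND settle_date >= TO_DATE('2015-01-01');
```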

Confidential, NY

Sr. Developer

Environment: Java 1.7, Hadoop (MapR), Map-Reduce, HBase-0.94.8, Hive, Linux Centos 5.x, Agile-Scrum, SVN, Maven, Ab-initio, Visio 2007, Cassandra, Solr, AWS

Responsibilities:

  • As a Senior Developer, worked directly with business partners discussing the requirements for applications
  • Analyzed the requirements in Ab-initio and modeled the requirements in Visio 2007
  • Wrote MapReduce programs in Java for data processing
  • Wrote extensive shell scripts to run the appropriate programs
  • Wrote Hive queries to confirm the records loaded into HBase
  • Wrote a program to bulk-upload data into HBase
  • Wrote multiple queries to pull data from HBase
  • Wrote Java code to pull related data from HBase
  • Configured and integrated Apache Solr for full text search. Implemented pre-built and custom Solr Response writers for access from multiple enterprise applications
  • Reported on the project based on the Agile-Scrum method.
  • Worked with and mentored the team using technologies like Hadoop and HBase
  • Experience with AWS cloud services such as EC2, S3, RDS, ELB and EBS for installing, configuring and troubleshooting various Amazon images for dedicated environments
  • Experience with Load Balancers on AWS
  • Worked on a new project on DataStax Enterprise (Cassandra, Solr)
  • Built development environment on AWS
  • Created project charter, success criteria for PoC stage and defined requirements
  • Identified data sources and rebuilt the original solution on Cassandra
  • Performed testing of writes and reads for documented high-value use cases
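Confirming HBase records from Hive, as described above, is commonly done by mapping an external Hive table onto the existing HBase table through the HBase storage handler. A sketch with illustrative table and column names:

```sql
-- External Hive table mapped onto an existing HBase table;
-- used to spot-check loaded records with plain HiveQL.
CREATE EXTERNAL TABLE hbase_records (
    rowkey STRING,
    name   STRING,
    amount DOUBLE
)
STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
WITH SERDEPROPERTIES ("hbase.columns.mapping" = ":key,d:name,d:amount")
TBLPROPERTIES ("hbase.table.name" = "records");

-- Example confirmation query: count rows missing a value.
SELECT COUNT(*) FROM hbase_records WHERE amount IS NULL;
```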

Confidential, Raleigh, NC

Developer

Environment: Java 1.7, Hadoop Hortonworks 1.x, Map-Reduce, HBase-0.94.8, Storm 0.9.1, Linux Centos 5.x, Agile, SVN, Maven, Jira, Apache Kafka

Responsibilities:

  • Led and managed the team during the design, development and implementation phases of the application.
  • As a Developer, worked directly with business partners discussing the requirements for new projects and enhancements to the existing applications.
  • Wrote Java code to process streams for risk management analysis.
  • Wrote extensive shell scripts to run appropriate programs
  • Wrote multiple queries to pull data from HBase
  • Reported on the project based on the Agile-Scrum method. Conducted daily Scrum meetings and updated JIRA with new details.
  • Wrote Java code to pull related data from HBase.
  • Worked with and mentored the team using technologies like Hadoop and Storm to process everything as a stream.
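The Storm stream-processing work above follows the standard topology-wiring pattern for the 0.9.x API (which still used the `backtype.storm` packages). A minimal Java sketch; `TradeSpout` and `RiskBolt` are hypothetical application classes, not part of Storm:

```java
import backtype.storm.Config;
import backtype.storm.StormSubmitter;
import backtype.storm.topology.TopologyBuilder;

public class RiskTopology {
    public static void main(String[] args) throws Exception {
        TopologyBuilder builder = new TopologyBuilder();

        // TradeSpout and RiskBolt are illustrative application classes:
        // the spout reads trade events (e.g. from Kafka), the bolt
        // computes risk metrics per event.
        builder.setSpout("trades", new TradeSpout(), 2);
        builder.setBolt("risk", new RiskBolt(), 4)
               .shuffleGrouping("trades");

        Config conf = new Config();
        conf.setNumWorkers(2);
        StormSubmitter.submitTopology("risk-stream", conf, builder.createTopology());
    }
}
```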

Confidential

Architect/Developer

Environment: Java 1.7, Hadoop 1.0.1, Map-Reduce, HBase-0.94.8, Apache NLP, Apache Nutch 1.6, Eclipse, twitter4j, Linux Centos 5.x on AWS

Responsibilities:

  • Wrote extensive shell scripts to run appropriate programs
  • Reported on the project based on the Agile-Scrum method. Conducted daily Scrum meetings
  • Wrote multiple java programs to pull data from HBase
  • Involved with File Processing using Pig Latin
  • As a developer, developed code per the requirements without compromising on timelines or code quality
  • Unit tested the developed code and ensured it was bug-free per standards.
  • Performed integration and system testing of the application and ensured it was implemented and successfully integrated with other existing systems.
  • Evaluated, prototyped and recommended various development tools and server platforms for the department; mentored team members and wrote “best-practices” documentation for the related technologies
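The Pig Latin file processing mentioned above typically follows a load/filter/group/store flow. An illustrative script; the paths and field names are assumptions, chosen to match the Twitter-data context of this project:

```pig
-- Illustrative file-processing flow; paths and fields are assumptions.
raw     = LOAD '/data/tweets' USING PigStorage('\t')
          AS (user:chararray, ts:long, text:chararray);
cleaned = FILTER raw BY text IS NOT NULL;
by_user = GROUP cleaned BY user;
counts  = FOREACH by_user GENERATE group AS user, COUNT(cleaned) AS n;
STORE counts INTO '/data/tweet_counts';
```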

Confidential

Solution Architect

Environment: Eclipse, LiveLink (CS10), IIS 7.0, Visio

Responsibilities:

  • Evaluated, prototyped and recommended various development tools and server platforms for the department; mentored team members and wrote “best-practices” documentation for the related technologies.
  • Developed class and sequence diagrams (UML) from the requirement documents using Rational Rose.
  • Ensured the team developed the application within the given timelines without compromising on the requirements or code quality.
  • As a delivery engineer, was responsible for ensuring the code built free of compilation errors and was delivered on time with permissible defects.

Confidential, Denver, CO

Sr. Java/ Content Management Developer

Environment: Java 1.5, AIX, Websphere 5.1, Eclipse, Documentum 5.3, Documentum 6.5, Confidential 3.0, Confidential 5

Responsibilities:

  • Using the Documentum API, wrote Java 1.5 code across different layers of Documentum such as Webtop and DFC
  • As a developer, developed code per the requirements without compromising on timelines or code quality
  • Supported users using Documentum system administration tools (DA) and WebSphere 5.1
  • Performed daily verification of Documentum-based application availability.
  • Installed Documentum-related components such as Full Text Index and DTS on AIX.
  • Leveraged Confidential API to write programs in Java
  • Information retrieval using Lucene
  • Configuring and deploying Java Application.
  • Maintaining and support of the existing application
  • Production support done using Remedy Tickets
  • Led Root Cause Analysis activities to successfully identify root causes of incidents.
  • Assessment & Estimation of the change to be made in the application
  • Responsible for application system upgrades and deployment of new releases to the production environment.
  • Develop and maintain Application Support standards and procedures and other documentation as required.

Confidential

Technical Team Leader, Sr. Java / Content Management Developer

Environment: Java 1.5, Weblogic 10, Eclipse, Confidential 3.0, Confidential 5

Responsibilities:

  • As a developer, developed code per the requirements without compromising on timelines or code quality, using Java, J2EE, Confidential 3.0, Oracle Database and the WSAD IDE.
  • Leveraged Confidential API to write programs in Java
  • Configured and deployed Java applications in Confidential.
  • Worked with business partners; actively managed daily and weekly meetings for work status and support progress.
  • Led and managed the team during the design, development and implementation phases of the application.
  • Worked on Java development on Confidential.
  • Extensively used the Weblogic Server for application development and production.
  • Used Rational Application Developer IDE for development and debugging
  • Developed the code following all standards and per the requirements, without any slippage in the timelines.
  • Information retrieval using Lucene.
  • Unit tested the developed code and made sure that code is bug free as per the standards.
  • Performed Integration testing of the system and made sure that the application is implemented and is successfully integrated with other existing systems.

Confidential, Piscataway, NJ

Sr. Java/CMS Developer

Environment: J2EE, Documentum, Java, XML, XML Schema, Websphere Application Developer 5.1

Responsibilities:

  • Developed the design documents based on the requirements documents provided using Java, J2EE, Documentum.
  • As a developer, involved in the development of code as per the requirements without compromising on the time lines and the quality of the code using Java, J2EE, Documentum 5.3, Oracle Database and WSAD IDE.
  • Developed Java utility classes to support date conversion and XML parsing
  • Leveraged Documentum API to write programs in Java
  • Developed the code following all standards and per the requirements, without any slippage in the timelines.
  • Maintaining and support of the existing application
  • Production support done using Remedy Tickets
  • Led Root Cause Analysis activities to successfully identify root causes of incidents.
  • Assessment & Estimation of the change to be made in the application
  • Unit tested the developed code and ensured it was bug-free per standards.
  • Performed integration testing of the system and ensured the application was implemented and successfully integrated with other existing systems.
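The date-conversion utility classes mentioned above can be sketched as follows. The `MM/dd/yyyy` input pattern is an assumption, and `java.time` is used here for clarity even though the original work predates it (the era's code would have used `SimpleDateFormat`):

```java
import java.time.LocalDate;
import java.time.format.DateTimeFormatter;

public class DateUtil {
    private static final DateTimeFormatter IN  = DateTimeFormatter.ofPattern("MM/dd/yyyy");
    private static final DateTimeFormatter OUT = DateTimeFormatter.ofPattern("yyyy-MM-dd");

    // Converts an MM/dd/yyyy string (assumed input format) to ISO yyyy-MM-dd.
    public static String toIso(String mdy) {
        return LocalDate.parse(mdy, IN).format(OUT);
    }
}
```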

Confidential, Newark, NJ

Sr. Java Developer

Environment: J2EE (Java 1.4, JSP), LDAP, DB2, WSAD 5.1

Responsibilities:

  • Implemented J2EE Design Patterns such as Business Delegate, Front Controller, MVC, Session Facade, Value Object, DAO, Service Locator, Singleton.
  • Led Root Cause Analysis activities to successfully identify root causes of incidents.
  • Assessment & Estimation of the change to be made in the application
  • Implemented database connectivity with JDBC to the DB2 database.
  • Implemented the server side processing using Java Servlets.
  • Implemented LDAP and role-based application security
  • Mentored team members in developing use case diagrams, sequence diagrams and preliminary class diagrams using UML, and in developing flexible, scalable and efficient Java code.
  • Maintaining and support of the existing application
  • Production support done using Remedy Tickets
  • Created development and test environment in WebSphere 5.1 and Apache Tomcat 4.1 web server.
  • Actively involved in the integration of different use cases, code reviews
  • Interacted with clients to gather requirements from end users
  • Customized the J2EE application to suit users' requirements
  • Performed enhancements to the application per users' requirements
  • Maintained the application using J2EE components like WebSphere
  • Queried the database using SQL for related results
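The Service Locator and Singleton patterns listed above can be combined in one small registry. A minimal sketch; in the original J2EE setting the lookup would fall back to a JNDI `InitialContext`, which is omitted here:

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

// Service Locator combined with the Singleton pattern: one registry
// instance caches service lookups so callers avoid repeated
// (historically JNDI) lookups.
public final class ServiceLocator {
    private static final ServiceLocator INSTANCE = new ServiceLocator();
    private final Map<String, Object> cache = new ConcurrentHashMap<>();

    private ServiceLocator() {}

    public static ServiceLocator getInstance() { return INSTANCE; }

    public void register(String name, Object service) {
        cache.put(name, service);
    }

    public Object lookup(String name) {
        // In a full J2EE implementation a cache miss would trigger a
        // JNDI InitialContext lookup; that fallback is omitted here.
        return cache.get(name);
    }
}
```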

Confidential

Sr. Java Developer

Responsibilities:

  • Involved in running the PeopleSoft-delivered Application Engine process for data conversion from the current version to the newer version
  • Involved in maintenance of HR, Benefits, Payroll and eApps
  • Created new and customized existing Record definitions, Pages, Components for the client specific functionality
  • Configured and maintained all the tables involved in Payroll processing, such as Pay calendar, JobCode, Deductions and Processes such as Paysheet creation, PayCalc, Payconfirm, Tax Tables etc
  • Customized and modified ePerformance, eCompensation, eRecruit pages upon user requests
  • Built custom configuration pages to the users to be able to add items that are used in ePerformance documents
