
Application Architect Resume


Addison, TX

SUMMARY:

  • 10+ years of software experience covering the complete software development life cycle (SDLC): analyzing, designing, documenting, coding, integrating, testing, implementing, and training users on various web applications in 2-tier and n-tier architectures, with an emphasis on Object-Oriented Design (OOD) and Object-Oriented Programming (OOP) methodologies.
  • Around 4 years of Big Data experience at Confidential & Confidential and Confidential.
  • Over 10 years of experience with Java, J2EE, and OSGi containers.
  • Around six months of experience in the Confidential & Confidential e-Strategy division in the telecom industry.
  • Around 12 years of experience in the BSS division of the telecom industry.
  • Over 3 years of experience in the banking industry.
  • Over 1.5 years of experience in the capital markets industry.
  • Over 10 years of customer-facing experience.
  • Cloudera Certified Developer for Apache Hadoop (CCDH4)
  • Databricks Certified Developer for Apache Spark
  • Involved in requirement gathering, architecture development, design, development, and deployment of solutions built on the Hadoop platform.
  • Good knowledge of R, SAS, and Mahout technologies.
  • Good knowledge of data warehousing and data marts.
  • Worked in telecom billing, capital markets, banking services, and web/online services.
  • More than 10 years of Confidential experience in both Ensemble and Enabler (Confidential telecom billing products), covering work for the Confidential clients Confidential & Confidential (USA), Far EasTone (Taiwan), Excelcom (Indonesia), Western Wireless (USA), and CenturyLink (USA), plus research and development projects.
  • Experience writing documents such as the High-Level Design (HLD), Detailed Design (DD), Application Interface Design (AID), Interface Design Document (IDT), Technical Release Notes (TRN), and user guide.
  • Experience working as a Business Architect (BA) in the telecom industry.
  • Ran code inspection tools such as Confidential ACIT and Confidential & Confidential CAST and made sure the code was compliant with the standards set by the clients Confidential & Confidential and Confidential.
  • Demonstrated competence as a team player with a reputation for integrity and the ability to handle multiple projects. Received unanimously positive feedback from all managers worked with so far, with consistent performance in the top 5-10% of employees.
  • Designed and developed many tools used by production support groups, infrastructure teams, and dev teams.

TECHNICAL SKILLS:

Operating Systems: MS-DOS, UNIX (HP-UX and Sun Solaris), and Windows 95/98/NT/2000/XP.

Languages: Pascal, Perl, C, C++, and Java (JDK, Javadoc).

RDBMS: MS Access, MS SQL Server, PostgreSQL 9.2, Oracle, AS/400

GUI: Visual Basic and Developer/2000.

Web Technologies: EJB, JSP, Servlets, JMS, JavaBeans, Spring, Struts, Hibernate, Ant, log4j, SOAP, HTML, DHTML, CSS (Cascading Style Sheets), XML, DTD (Document Type Definition), XML Schema (XSD), XSL, XSLT, WSDL, XMLC, JAXP (Java API for XML Processing), SAX (Simple API for XML), DOM (Document Object Model), JavaScript, VBScript, Apache ServiceMix (OSGi), Apache CXF

Web/App Servers: Oracle WebLogic Server 5.1/6.1/8.1/9.2, WebSphere, and Tomcat.

Tools: MS Project, Rational Rose, Microsoft Visio, CVS, PVCS, SVN, Mercury TestDirector, HP Application Lifecycle Management (ALM) 11.0, Webtrax, iTracks, LoadRunner, JBuilder, Visual Studio, Eclipse, Tuxedo, TOAD, Crystal Reports, Dreamweaver, Adobe Photoshop, Flash, XMLSpy, UltraEdit, rsyslog

Telco Tools & Applications: Confidential Studio, xtra-C, Confidential Code Inspection Tool (ACIT) for Java, ACIT for C++, Method Invoker (MI), SONAR & SONAR Template Builder, Fox, Tiger, AMC, TRB, UAMS, Jutil, JSP Infra, EJB Infra, Enabler-CM, Enabler-A&F, Enabler-AR, Enabler-ProductCatalog, Ericsson Velocity Studio 14.2

Data Warehousing: Hadoop ecosystem 2.0, HDFS, MapReduce, Pig, Hive, HBase, Cassandra, ZooKeeper, Sqoop, Flume, NoSQL, Oozie, Informatica 8.6 (ETL), Spark, Python 2.6.6, YARN, HDP 2.1, HDP 2.2, Talend, MongoDB, Kafka, Storm, Tez, R, SAS, Mahout, Splunk/Hunk, Platfora, Datameer, Ranger, Knox, AWS, Amazon EC2, Amazon S3

WORK EXPERIENCE:

Confidential, Addison, TX

Application Architect

Responsibilities:

  • Designed and delivered the Dodd-Frank retention compliance application (UDAS), allowing clients to stream archives using multi-tenant Play web services and servlets.
  • Developed the UDAS search portal using the Play Framework, Ebean, JDBC, and JSP.
  • Authentication is done through an LDAP Active Directory lookup, with authorization handled by custom logic against the database (see the sketch after this list).
  • Involved with the BofA automated defensive infrastructure (AIMS) using RESTful web services, including multi-threaded processing for a high-volume batch request mechanism.
  • Developed AIMS using Struts 2, the Tiles framework, Spring Core, Spring AOP, and Hibernate.
  • Analyzed bank-wide privileged access control objectives and their impact on the applications supported.
  • Designed and developed the storage of all application entitlements in the Centralized Secured Database (CSDB) for reporting, dormancy, and revocation.
  • Analyzed and implemented multi-factor authentication (MFA) support for interactive applications through CA SiteMinder.
  • Developed logging, reporting, and alerting using the bank's proprietary ECSL tool.
  • Used the rsyslog utility to feed application logs in real time to the ECSL logging system.
  • Managed the overall timeline for the privileged access objectives and worked with the team to deliver them on time.
  • Interfaced with third-party archiving tools such as EMC Centera for unstructured data, IBM InfoSphere Optim Archive for structured data, and Data Insight (DI)/Enterprise Vault (EV).
  • Completed a POC and implementation to interface with the Scality storage system through its AWS S3-compatible API.
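
The Active Directory lookup above can be illustrated with a minimal JNDI sketch; the host, port, and UPN suffix below are hypothetical placeholders, not the bank's actual configuration.

    // Minimal sketch of an LDAP Active Directory authentication check via JNDI.
    // ldap://ad.example.com and the @example.com UPN suffix are hypothetical.
    import java.util.Hashtable;
    import javax.naming.Context;
    import javax.naming.directory.InitialDirContext;

    public class AdAuthenticator {
        public boolean authenticate(String user, String password) {
            Hashtable<String, String> env = new Hashtable<>();
            env.put(Context.INITIAL_CONTEXT_FACTORY, "com.sun.jndi.ldap.LdapCtxFactory");
            env.put(Context.PROVIDER_URL, "ldap://ad.example.com:389");
            env.put(Context.SECURITY_AUTHENTICATION, "simple");
            env.put(Context.SECURITY_PRINCIPAL, user + "@example.com"); // UPN-style bind
            env.put(Context.SECURITY_CREDENTIALS, password);
            try {
                new InitialDirContext(env).close(); // successful bind means valid credentials
                return true;
            } catch (Exception e) {
                return false; // bad credentials or directory unreachable
            }
        }
    }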

Environment: JDK 1.7/1.8, Oracle 9i, SQL, Tomcat 8, Struts 2 MVC framework, Tiles framework, Jersey REST framework, Play MVC web framework, JDBC, Quartz scheduling, JPA 2, Hibernate 3, Spring Core 3, Spring Data, Eclipse Kepler, SVN, Git, Unix, EMC Centera SDK, IBM Optim

Confidential, Irving, TX

Big Data Architect/Sr. Developer

Responsibilities:

  • Involved in designing data pipelines, from extraction through visualization.
  • Involved in requirement gathering, architecture development, design, development, and deployment of solutions built on the Hadoop platform.
  • Set up the cluster in an internal cloud similar to AWS (Amazon EC2).
  • Implemented on the Cloudera (CDH 4.5) distribution.
  • Prototyped the solution and demonstrated it to the Citi internal client.
  • Developed the code and unit tests.
  • Data extraction is done using Sqoop to load data from the Oracle DB into the data lake (big data) platform.
  • Data is stored in the big data Hive database; it can be queried over JDBC, as in the sketch after this list.
  • Employee connections analysis and visualizations were done using the business intelligence (BI) tool Datameer.
  • Used Datameer workbooks to load filtered table data and run transformations such as join, group-by, count, and concat.
  • Used the Datameer business infographics designer to design the visual reports.
  • Used Cloudera Manager to monitor the Hadoop ecosystem.
  • Used Hue to interact with the Hadoop platform.
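
A minimal sketch of querying the Hive database through HiveServer2's JDBC driver; the host, user, and table name are hypothetical placeholders.

    // Queries a Hive table over JDBC (HiveServer2).
    // hive-host.example.com and employee_connections are hypothetical.
    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.Statement;

    public class HiveQueryExample {
        public static void main(String[] args) throws Exception {
            Class.forName("org.apache.hive.jdbc.HiveDriver");
            try (Connection conn = DriverManager.getConnection(
                    "jdbc:hive2://hive-host.example.com:10000/default", "etl_user", "");
                 Statement stmt = conn.createStatement();
                 ResultSet rs = stmt.executeQuery(
                     "SELECT dept, COUNT(*) FROM employee_connections GROUP BY dept")) {
                while (rs.next()) {
                    System.out.println(rs.getString(1) + "\t" + rs.getLong(2));
                }
            }
        }
    }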

Environment: Hadoop Cloudera CDH 4.5, HDFS, Hive 0.13, Sqoop 0.8, Pig 0.12, Hue, Cloudera Impala, Java (JDK 1.7), Oracle 11g, PL/SQL, SQL*Plus, Unix shell scripting, Platfora 2.6.5, Datameer 3.1.8, R

Confidential, Richardson, TX

Big Data Architect/Sr. Developer

Responsibilities:

  • Evaluated big data technologies and prototyped solutions to improve data processing architectures.
  • Built distributed, scalable, and reliable data pipelines that ingest and process data.
  • Implemented complete big data solutions covering data acquisition, storage, transformation, and analysis.
  • Involved in requirement gathering, architecture development, design, development, and deployment of solutions built on the Hadoop platform.
  • Involved with the upgrade of Hortonworks from 1.2 to 2.2.
  • Implemented projects handling considerable data volumes, i.e., 1.5 TB/day.
  • For each source, gathered the requirements and converted them into an AID and HLD.
  • Reviewed the AID and HLD with the BA, PM, Hadoop admin, and development leads.
  • Provided volumetric reports to the Hadoop admin so the systems could be given the required resources.
  • Prototyped the solution and demonstrated it to the client Confidential & Confidential.
  • Developed the code and unit tests.
  • Used Ant to build the environment for each source and used SVN as the code repository.
  • Wrote scripts in Unix shell and Python.
  • Loaded extracted data into the data lake (big data) platform from multiple sources such as Oracle, Teradata, Vertica, and EDW using Sqoop, file extracts, and TPT extracts.
  • Loaded data into stage and gold Hive tables using Pig, MapReduce, Hive, and HCatalog.
  • Wrote UDFs for use in Hive and also used UDFs from the Hadoop library; a sketch follows this list.
  • Tracked source files in HBase tables and ZooKeeper using the Java APIs.
  • Performed attribute-level encryption using the Voltage APIs.
  • Scheduled jobs using IBM Tivoli Workload Scheduler (TWS).
  • Used Hortonworks Ambari to monitor the Hadoop ecosystem.
  • Built a prototype with HDP Kafka and Storm for a clickstream application.
  • Trained in Spark and Scala.
  • Provided the deployment guide and runbook for the production support and Hadoop admin groups.
  • Coordinated with Hadoop admins during deployment to production.
  • Involved in re-certifying projects after the Hadoop upgrade from 1.3 to 2.2 (YARN).
  • Observed system performance using Ganglia and Nagios on the dev and prod servers.
  • Participated in the daily Agile standup call.
  • Involved in the development of user stories in each iteration.
  • Coordinated with and supported offshore Confidential development teams.
  • Provided system testing (ST) support.
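
A minimal sketch of a Hive UDF of the kind described above; the class name and masking behavior are illustrative assumptions, not the actual UDFs written for this project.

    // A simple Hive UDF (Hive 0.13-era API) that masks all but the last
    // four characters of a string column. Null-safe; Hive calls evaluate() per row.
    import org.apache.hadoop.hive.ql.exec.UDF;
    import org.apache.hadoop.io.Text;

    public class MaskUdf extends UDF {
        public Text evaluate(Text input) {
            if (input == null) return null;
            String s = input.toString();
            int keep = Math.min(4, s.length());
            StringBuilder masked = new StringBuilder();
            for (int i = 0; i < s.length() - keep; i++) masked.append('*');
            return new Text(masked.append(s.substring(s.length() - keep)).toString());
        }
    }

After adding the JAR to the session, such a function is registered and invoked from HiveQL, e.g. CREATE TEMPORARY FUNCTION mask_attr AS 'MaskUdf'; then SELECT mask_attr(some_column) FROM some_table; (names hypothetical).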

Environment: Hadoop Hortonworks 1.3/2.2, MapReduce, HDFS, Hive 0.13, Sqoop 0.8, Pig 0.12, Java (JDK 1.7), Hortonworks Hadoop distribution, TPLQuery, Oracle 11g, PL/SQL, SQL*Plus, Toad 9.6, Unix shell scripting, Agile, Ant, Rally, HBase 0.94, Kafka, Storm, ZooKeeper

Confidential, Dallas, TX

Solution Architect/Team Lead

Responsibilities:

  • Involved in designing new flows and defining interfaces with multiple external applications.
  • Completed a POC and demonstration of the Coherence caching setup.
  • Completed a POC and demonstration of JUnit and PowerMock for unit testing.
  • Participated in the daily Agile standup call.
  • Built and deployed the applications as OSGi bundles using Maven on the ServiceMix platform.
  • Involved in the development of user stories in each iteration.
  • Reviewed other team members' code.
  • Coded SOA interfaces, i.e., REST and SOAP APIs, using the Apache CXF framework; a sketch follows this list.
  • Coordinated with and supported offshore Confidential development team members.
  • Provided ST/IST and E2E support and resolved defects.
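
A minimal sketch of a JAX-RS REST resource of the kind Apache CXF hosts; the resource path and stub payload are hypothetical placeholders, not the project's actual API.

    // A JAX-RS resource; CXF exposes it as a REST endpoint once it is
    // registered in the bundle's Blueprint/Spring configuration.
    import javax.ws.rs.GET;
    import javax.ws.rs.Path;
    import javax.ws.rs.PathParam;
    import javax.ws.rs.Produces;
    import javax.ws.rs.core.MediaType;

    @Path("/accounts")
    public class AccountResource {
        @GET
        @Path("/{id}")
        @Produces(MediaType.APPLICATION_JSON)
        public String getAccount(@PathParam("id") String id) {
            // Real logic would call a service layer; this returns a stub payload.
            return "{\"id\":\"" + id + "\",\"status\":\"ACTIVE\"}";
        }
    }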

Environment: HTML5, CSS, JavaScript, AngularJS, Spring Core 3.2, Spring MVC, Spring Security, Maven, Apache ServiceMix 4.5.3, OSGi, WebLogic 10.3, REST and SOAP web services, XML, WSDL, SonarQube, JUnit 4.11, PowerMockito 1.5.5, EclEmma, Hibernate, SoapUI 5.0, SVN, Unix, Agile, Rally

Confidential, Addison, TX

Lead System Analyst

Responsibilities:

  • Production support focal point for multiple middleware modules, i.e., the Common Services Interface (CSI) and Business Rules Management Services (BRMS).
  • Coordinated with the dev team in planning the monthly releases, making sure releases were deployed to production smoothly with no impact to end users.
  • Wrote Unix shell scripts to generate reports from error logs.
  • Handled message queue maintenance, i.e., processing and deleting messages from the queues deployed on WebSphere; see the JMS sketch after this list.
  • Handled issues with SOAP and REST web services. Worked with the monitoring team to deploy the alert rules into production.
  • Maintained the alert escalation setup in NEWS (the Notification, Escalation and Wireless System).
  • Planned and carried out production maintenance for ad hoc requests from interface applications.
  • Maintained access-related activities for the whole team through Nexus requests.
  • Prepared the weekend activities, set expectations for the team, and made sure the team adhered to the plan.
  • Held biweekly meetings with development to discuss production issues, recommendations, and defects.
  • Made sure the SLAs defined by the bank interfaces were not breached.
  • Participated in designing the production system architecture and fine-tuning the WebSphere configuration.
  • Maintained defects (IM tickets) and requests for change (RFCs) in Maximo.
  • Verified defect fixes before they were implemented in production.
  • Participated in design review meetings. Maintained defect fixes, merges, and assignments in web-based defect management tools such as iTracks, Webtrax, and SVN.
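
The queue maintenance above can be illustrated with a minimal JMS sketch that browses a queue without consuming its messages; the JNDI names are hypothetical placeholders.

    // Browses a queue via the standard javax.jms API. jms/ConnectionFactory
    // and jms/ErrorQueue are hypothetical JNDI names.
    import java.util.Enumeration;
    import javax.jms.Queue;
    import javax.jms.QueueBrowser;
    import javax.jms.QueueConnection;
    import javax.jms.QueueConnectionFactory;
    import javax.jms.QueueSession;
    import javax.jms.Session;
    import javax.jms.TextMessage;
    import javax.naming.InitialContext;

    public class QueueInspector {
        public static void main(String[] args) throws Exception {
            InitialContext ctx = new InitialContext();
            QueueConnectionFactory qcf =
                (QueueConnectionFactory) ctx.lookup("jms/ConnectionFactory");
            Queue queue = (Queue) ctx.lookup("jms/ErrorQueue");
            QueueConnection conn = qcf.createQueueConnection();
            conn.start();
            QueueSession session = conn.createQueueSession(false, Session.AUTO_ACKNOWLEDGE);
            QueueBrowser browser = session.createBrowser(queue);
            Enumeration<?> messages = browser.getEnumeration(); // reads without consuming
            while (messages.hasMoreElements()) {
                Object m = messages.nextElement();
                if (m instanceof TextMessage) {
                    System.out.println(((TextMessage) m).getText());
                }
            }
            browser.close();
            session.close();
            conn.close();
        }
    }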

Environment: WebSphere, IBM MQ Series, Java/J2EE, EJB, JDBC, Java web services, CA Introscope, shell scripting, and many other bank proprietary tools such as AppWatch, OpsConsole, MRD, DNT, SVN, iTracks, Webtrax, NEWS, Nexus, CA AutoSys 11.0, EMC Atmos Cloud, Spring, JRules, ILOG, REST API on EMC Atmos

Confidential, Richardson, TX

Development Expert (Big Data)

Responsibilities:

  • Evaluated big data technologies and prototyped solutions to improve data processing architectures.
  • Built distributed, scalable, and reliable data pipelines that ingest and process data.
  • Implemented complete big data solutions covering data acquisition, storage, transformation, and analysis.
  • Involved in requirement gathering, architecture development, design, development, and deployment of solutions built on the Hadoop platform.
  • Implemented projects handling considerable data volumes, i.e., 1.5 TB/day.
  • For each source, gathered the requirements and converted them into an AID and HLD.
  • Reviewed the AID and HLD with the BA, PM, Hadoop admin, and development leads.
  • Provided volumetric reports to the Hadoop admin so the systems could be given the required resources.
  • Prototyped the solution and demonstrated it to the client Confidential & Confidential.
  • Developed the code and unit tests.
  • Used Ant to build the environment for each source and used SVN as the code repository.
  • Wrote Unix scripts in shell and Python.
  • Loaded extracted data into the data lake (big data) platform from multiple sources such as Oracle, Teradata, Vertica, and EDW using Sqoop, file extracts, and TPT extracts.
  • Loaded data into stage and gold Hive tables using Pig, MapReduce, Hive, and HCatalog.
  • Wrote UDFs for use in Hive and also used UDFs from the Hadoop library.
  • Tracked source files in HBase tables and ZooKeeper using the Java APIs; a sketch follows this list.
  • Performed attribute-level encryption using the Voltage APIs.
  • Scheduled jobs using IBM Tivoli Workload Scheduler (TWS).
  • Used Hortonworks Ambari to monitor the Hadoop ecosystem.
  • Provided the deployment guide and runbook for the production support and Hadoop admin groups.
  • Coordinated with Hadoop admins during deployment to production.
  • Involved in re-certifying projects after the Hadoop upgrade from 1.3 to 2.2 (YARN).
  • Observed system performance using Ganglia and Nagios on the dev and prod servers.
  • Participated in the daily Agile standup call.
  • Involved in the development of user stories in each iteration.
  • Coordinated with and supported offshore Confidential development teams.
  • Provided system testing (ST) support.
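
A minimal sketch of the source-file tracking writes via the HBase 0.94-era Java API; the table name, column family, and row-key layout are hypothetical placeholders.

    // Records a source file's load status in an HBase tracking table.
    // "source_file_tracking" and its schema are hypothetical.
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.hbase.HBaseConfiguration;
    import org.apache.hadoop.hbase.client.HTable;
    import org.apache.hadoop.hbase.client.Put;
    import org.apache.hadoop.hbase.util.Bytes;

    public class SourceFileTracker {
        public static void main(String[] args) throws Exception {
            Configuration conf = HBaseConfiguration.create();
            HTable table = new HTable(conf, "source_file_tracking");
            // Row key: source name | business date | file name.
            Put put = new Put(Bytes.toBytes("source1|2015-01-15|file_0001.dat"));
            put.add(Bytes.toBytes("f"), Bytes.toBytes("status"), Bytes.toBytes("LOADED"));
            put.add(Bytes.toBytes("f"), Bytes.toBytes("loaded_at"),
                    Bytes.toBytes(String.valueOf(System.currentTimeMillis())));
            table.put(put);
            table.close();
        }
    }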

Environment: Hadoop 1.3/2.2, MapReduce, HDFS, Hive 0.13, Sqoop 0.8, Pig 0.12, Java (JDK 1.7), Hortonworks Hadoop distribution, TPLQuery, Oracle 11g, PL/SQL, SQL*Plus, Toad 9.6, Unix shell scripting, Agile, Ant, Rally, HBase 0.94/Cassandra, ZooKeeper

Confidential, Richardson, TX

Development Expert

Responsibilities:

  • Customer-facing role, meeting the customer's expectations.
  • Worked with onshore and offshore teams, including offshore dev teams spread across the USA, Israel, Cyprus, and India.
  • Handled Confidential proprietary products, i.e., the Customer Management (CM), Accounts Receivable (AR), Transaction Broker (TRB), Confidential Monitor & Control (AMC), and Replenishment (RPL) modules.
  • Supported and resolved integration issues with CRM, OMS, Acquisition & Formatting (A&F), Event Processing (EP), Invoicing (INV), Pricing (PC), and Confidential & Confidential /third-party proprietary software such as BDS, CSI, DUCS, Pitney Bowes GeoTax, eCDW, EDW, eTracs, and CAPM.
  • Held high-level and detailed design reviews with offshore dev.
  • Online activities are supported through EJBs, JMS, and web services; see the sketch after this list.
  • Coordinated with offshore dev teams on multiple deliveries for every release and ensured timely, quality deliverables.
  • Supported various onshore testing groups: integrated system testing (IST), end-to-end (E2E) testing, performance testing, user acceptance testing (UAT), and regression testing.
  • Maintained defect fixes, merges, and assignments in web-based defect management tools such as Quality Center, HP ALM 11.0, iTracks, and Webtrax.
  • Made sure the SLAs defined by the customer were not breached.
  • Participated in designing the production system architecture and fine-tuning the WebLogic configuration.
  • Analyzed defects to identify the source (code or data) and fixed them, or coordinated with offshore dev teams on the timely delivery of fixes; made sure SLAs were not breached.
  • Involved in the integration of Enabler with att.com, which was developed using ATG.
  • Involved in the integration of Enabler with the Customer Relationship Management (CRM) and Order Management System (OMS) applications.
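
A minimal EJB 3-style sketch of the kind of online service bean described above. The names and operation are hypothetical, and the WebLogic 9.2-era code would have used EJB 2.x home/remote interfaces, but the shape of the exposed operation is analogous.

    // A stateless session bean exposing one online operation.
    // BalanceService and its behavior are illustrative assumptions.
    import javax.ejb.Remote;
    import javax.ejb.Stateless;

    @Remote
    interface BalanceService {
        double getAccountBalance(String accountId);
    }

    @Stateless
    public class BalanceServiceBean implements BalanceService {
        @Override
        public double getAccountBalance(String accountId) {
            // Real logic would query Accounts Receivable (AR); stubbed here.
            return 0.0;
        }
    }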

Environment: Confidential CES 7.5, Confidential Customer Management (CM), Confidential Accounts Receivable (AR), Confidential AMC, RPL, Oracle 10.2, Java/J2EE, JDBC, WebLogic 9.2, Toad 8.5, Java MQ Series, Java web services, CA Introscope, Database View Manager (DVM)/Hibernate, OSGi
