Data Solution Resume Profile
NJ
Summary
- 13 years of experience in the IT industry across complete project life cycles, including design, development, testing, and production support.
- Extensive experience providing end-to-end solutions to companies moving into big data, dealing with all aspects of the system. Worked with Hadoop, HBase, Sqoop, Storm, Kafka, advanced MapReduce, Mahout, and R within a Linux environment.
- Extensive experience with MapReduce algorithms used for ETL.
- Extensive experience analyzing data using complex statistical models with the clustering, classification, and recommendation algorithms built into Mahout, as well as in R. Used these algorithms for data mining, machine learning, and forecasting.
- 6 years of experience in the architecture and development of numerous J2EE enterprise application implementations using WebSphere CE, WebLogic SIP Server (WLSS), the Eclipse IDE, JBoss, and Apache Tomcat.
- Extensive experience as a web infrastructure/Linux/Unix/Solaris admin: installing, configuring, and supporting development tools, and providing support to development teams.
- Extensive experience building applications from scratch. Experienced in designing enterprise applications, developing and writing build and deployment scripts, and optimizing and writing test scripts.
- Experience in enterprise system architecture and object-oriented analysis and design techniques with design patterns.
- Extensive skills in deploying and maintaining systems in elastic cloud computing and virtualization environments for use in the SaaS business model.
- Experienced with Core Java, J2EE, Servlets, Spring MVC, Hibernate, Apache Camel, JSP, JSF, EJB 3.0, XML/XSL, XSLT, Web Services, SOA, JDBC, JNDI, LDAP, JMS, SQL, OAuth, OpenID, SAML 2.0, OpenSSO, PL/SQL, and stored procedures, using DB2 UDB, Oracle, SQL Server, and MySQL as databases.
- Several full software development life cycles (SDLC) using the Scrum Agile project methodology.
CONFIDENTIAL
- Worked on the DaaS (Data as a Service) data pipeline platform, migrating from a traditional platform to big data technologies.
- The project involved work from both the base location in New Jersey and the Dublin, Ireland office, as well as collaboration with teams in other locations.
- Worked on Cloudera CDH5 for this project, including deployment on Amazon AWS.
- Worked on the architecture of all the technologies involved and performed cost-benefit analysis of the technology choices.
- Worked on Redis and HBase as part of the NoSQL data store in the data pipeline.
- Worked on pushing data from HBase into Redis, which is part of the vending mart and provides very fast in-memory access.
- Worked with ZooKeeper and Curator to enhance the capabilities of Redis.
CONFIDENTIAL
- Worked on providing an end-to-end big data solution for fashion data.
- Installed and set up the big data ecosystem for a company that had never used Hadoop before.
- Imported data from MySQL, MS SQL, and MongoDB; importing from MySQL and MS SQL was easy to do using Sqoop.
- Since Sqoop doesn't support MongoDB, developed a MongoDB connector to both Hadoop and HBase. In the process of adding more functionality to this connector and releasing it as an open source project.
- Worked on aggregating the data using advanced aspects of MapReduce.
- Worked through many iterations of aggregation and modeled the complex fashion data set in Apache Mahout to run analytics on it.
- Worked with the fashion data to make recommendations for users based on both item and user similarity, and also tried Slope One based recommendations.
- Worked on classifying data for different fashion boards using Mahout; also worked with Scala libraries.
- Used R with the RStudio IDE for forecasting fashion trends using logistic regression within RHadoop; used various aspects of R modeling and the multitude of packages available for the R language. Modeled various attributes from the data in R to arrive at the implementation.
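The item-similarity idea behind the recommendations above can be sketched in plain Java. This is a minimal, hypothetical illustration of the cosine-similarity measure a Mahout-style item-based recommender relies on; the class, method, and item names are invented here and are not the project's actual code.

```java
import java.util.HashMap;
import java.util.Map;

public class ItemSimilarity {

    // Cosine similarity between two items' user-rating vectors
    // (maps from user ID to that user's rating of the item).
    static double cosine(Map<String, Double> a, Map<String, Double> b) {
        double dot = 0.0, normA = 0.0, normB = 0.0;
        for (Map.Entry<String, Double> e : a.entrySet()) {
            Double other = b.get(e.getKey());
            if (other != null) dot += e.getValue() * other; // only shared users contribute
            normA += e.getValue() * e.getValue();
        }
        for (double v : b.values()) normB += v * v;
        if (normA == 0 || normB == 0) return 0.0;
        return dot / (Math.sqrt(normA) * Math.sqrt(normB));
    }

    public static void main(String[] args) {
        // Hypothetical rating vectors for two fashion items.
        Map<String, Double> scarf = new HashMap<>();
        scarf.put("alice", 5.0);
        scarf.put("bob", 3.0);
        Map<String, Double> hat = new HashMap<>();
        hat.put("alice", 5.0);
        hat.put("bob", 3.0);
        System.out.println(cosine(scarf, hat)); // identical vectors -> 1.0
    }
}
```

Items whose rating vectors point in the same direction score near 1.0 and become candidates to recommend alongside each other; Mahout applies the same measure at scale over MapReduce.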
CONFIDENTIAL
- Worked as overall lead of the whole project, providing leadership and guidance to other team members.
- Worked on syncing an Oracle RDBMS to HBase on Hadoop while retaining Oracle as the main data store.
- Worked on advanced aspects of HBase schema design.
- Imported data from Oracle tables into HBase using Sqoop.
- Transformed the Oracle tables into one big HBase table, implementing the transformation in Java.
- Also set up the whole Hadoop ecosystem from scratch, including installing and configuring the various components to work together.
- Worked on the design aspects of the solution, including the new table schema; also investigated using Hadoop as a real-time solution.
- Looked into using Spring event listeners to trigger syncing of Oracle data to the HBase DB.
- Worked on a real-time message queuing system for event logs using Apache Storm, with Apache Kafka as the broker, outputting the results into HBase.
- Defined Storm topologies with spouts and bolts for filtering, used as input for the Kafka messaging system.
- Configured Kafka for both producers and consumers of messages.
- Output the message queue from Kafka into HBase, designing the HBase schema to best match the input data from Kafka.
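One concern in HBase schema design for time-ordered event logs like these is region hot-spotting: monotonically increasing timestamps all land on the same region. A common mitigation is a salted composite row key. The sketch below is a hypothetical illustration of that pattern, not the project's actual key layout; the field names and bucket count are assumptions.

```java
public class RowKeys {

    static final int SALT_BUCKETS = 16; // illustrative; tuned to region count in practice

    // Builds a key of the form salt|eventType|timestamp. The salt prefix
    // spreads sequential timestamps across regions instead of hammering one.
    static String rowKey(String eventType, long timestampMillis) {
        int salt = Math.abs((eventType + timestampMillis).hashCode()) % SALT_BUCKETS;
        return String.format("%02d|%s|%013d", salt, eventType, timestampMillis);
    }

    public static void main(String[] args) {
        System.out.println(rowKey("login", 1400000000000L));
    }
}
```

Reads then fan out over the salt buckets (one scan per prefix), trading some read complexity for even write distribution from the Kafka consumers.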
CONFIDENTIAL
- Worked on a proprietary risk analytics system to detect health care fraud. Used Agile development to work through stories implementing new features in the system.
- Used JUnit to test out the stories.
- Worked on the ActiveMQ broker used in a fraud detection system where messages are exchanged between an outside vendor and the internal system, with failures monitored in the DLQ.
- Worked with Apache Camel to build routes using the Java DSL for a system that needed integration with an external vendor.
- Worked with Akka for concurrency and Scala coding.
CONFIDENTIAL
- Worked in the Platform as a Service (PaaS) group on cloud realization. Worked with an external vendor whose platform is similar to salesforce.com; the platform uses a multi-tenant architecture that lets customers share resources.
- Worked with the Cloudera Hadoop (CDH) big data platform, including the Hive and Pig modules, to generate queries against the data.
- Wrote MapReduce programs in Java to help sort the data, and fed the MapReduce output into HBase for more structured output.
- Performed data analysis using Apache Mahout for collaborative filtering (recommendation engine), clustering, and other statistical and machine learning analysis methods for better predictions and forecasting.
- Intelligently integrated traditional structured data with unstructured data from the web and social media. Moved tables from Teradata to Hadoop using Sqoop.
- Worked with Revolution R.
- Worked with the NoSQL database MongoDB, performing many different operations and using various features including tagging and geospatial queries.
- Used SAML, OpenID, and OAuth for authentication and authorization in a web-based single sign-on (SSO) solution for different use cases. Used Apache Camel routes and redirects to implement the OAuth dance.
- Worked with SiteMinder to create a web single sign-on (SSO) instance using SAML. Also used Tivoli and TSAM.
- Used Spring Security libraries to implement authentication and access control for form- and method-based applications; also wrote Java code with Spring LDAP libraries for authenticating users stored on the LDAP server. Most of the work was done using SpringSource libraries, deployed in VMware, and built in the SpringSource STS Eclipse IDE.
- Wrote Java code to extract usage elements that could be used for billing purposes. The coding involved setting up Hibernate mappings to the RDBMS for the usage elements, which were extracted using JDBC and RESTful web services. Oracle Database 11g was used.
- Consumed RESTful web services from the PaaS platform; tested all the services using the basic URL method and the Poster plug-in for Firefox, and performed more detailed analysis in Java code, authenticating to log in, setting up sessions, and passing the payload for the POST method.
- Used RESTful web services to gather usage information from the platform, using XPath and JDOM for data manipulation. Processed data in both XML and JSON formats. Some authentication was done using OpenSSO and SAML.
- For part of the time, worked as project lead: conducted daily status meetings and coordinated the efforts of team members across both the cloud infrastructure and development teams. This project management role involved overseeing a team of seven technical professionals and assigning tasks to attain maximum efficiency.
- Created design and architecture documents for the billing system and coordinated the effort across cross-functional teams throughout this huge company.
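Extracting usage elements from an XML payload with XPath, as described above, can be sketched with only the JDK's built-in XML support. The report structure and element names below are invented for illustration; the actual platform's payload format is not shown in this profile.

```java
import java.io.ByteArrayInputStream;
import javax.xml.parsers.DocumentBuilderFactory;
import javax.xml.xpath.XPath;
import javax.xml.xpath.XPathFactory;
import org.w3c.dom.Document;

public class UsageParser {

    // Sums the <hours> field of every <usage> record in a (hypothetical)
    // usage report, the kind of figure a billing job would aggregate.
    static double totalHours(String xml) throws Exception {
        Document doc = DocumentBuilderFactory.newInstance()
                .newDocumentBuilder()
                .parse(new ByteArrayInputStream(xml.getBytes("UTF-8")));
        XPath xpath = XPathFactory.newInstance().newXPath();
        return Double.parseDouble(
                xpath.evaluate("sum(/report/usage/hours)", doc));
    }

    public static void main(String[] args) throws Exception {
        String xml = "<report>"
                + "<usage><tenant>a</tenant><hours>2.5</hours></usage>"
                + "<usage><tenant>b</tenant><hours>4.0</hours></usage>"
                + "</report>";
        System.out.println(totalHours(xml)); // 6.5
    }
}
```

The same traversal could be done with JDOM as mentioned in the bullet; the standard `javax.xml.xpath` API is used here only to keep the sketch dependency-free.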
CONFIDENTIAL
- This is a VoIP-based communication solution and service for businesses using the Software as a Service (SaaS) model. It is based on the Asterisk PBX server running on cloud computing technology on Red Hat CentOS 5.3 Linux, and required extensive use of J2EE to provide enterprise services to businesses. Performed the following tasks:
- Worked on OpenID- and OAuth1-based authentication using SAML.
- Worked on OpenSSO with LDAP to enable this application to work in a SAML-federated environment, including writing SAML code for authentication purposes.
- Used ServiceMix ESB and ActiveMQ to connect services in an SOA environment. The services had access to EJBs and JMS messaging and were able to persist data into a database. Built business orchestrations using BPEL. Also created web services using Apache CXF.
- Created a workflow engine using BPEL, set up both in Eclipse and in an Oracle SOA environment using JDeveloper.
- Used the ActiveMQ message broker to create a JMS pub/sub application. Created a multi-broker configuration that ran over multiple sites. Also used Apache Camel for integration with various components.
- Designed and coded programs to run multi-threaded so they execute more efficiently.
- Used Lucene/Solr to index documents and used the Nutch web crawler to crawl specified pages, essentially using the Lucene libraries within the Solr application.
- Created a VoIP server and added a platform for an IVR system, with associated features including voice recognition and text-to-speech (TTS). Installed, created, and integrated a SQL database with the IVR. Also did socket programming for the VoIP.
- Used SIP servlets to implement a VoIP signaling flow for a conference calling application, in conjunction with the WebLogic SIP Server (WLSS), which provides support for the SIP protocol.
- Used UML class, use case, and sequence diagrams to create designs for applications and developed code from those designs.
- Installed and configured JBoss with Tomcat and EJB 3.0, then developed an application using EJB 3.0 entity beans for business logic and message-driven beans (MDBs) with JMS to create both pub/sub and point-to-point messaging.
- Worked with a client to create a J2EE application using VoIP, IVR, and SMS servers on an existing virtual container. This involved a full product software cycle, working with a cross-functional team from architecture definition through implementation. Used JMS messaging with MDBs; also used ActiveMQ.
- Converted an existing J2EE Spring framework application to integrate Hibernate, gaining the benefits of ORM and making the application code simpler. Wrote the XML mappings to MySQL and Java code using the Spring and Hibernate frameworks. Also responsible for writing stored procedures and functions for the database, along with general maintenance, tuning, and administration; used Toad to test code quality.
- Used Apache CXF to expose class methods as web services. The web services were SOAP-based and fully tested for functionality.
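The multi-threading pattern mentioned above can be illustrated with `java.util.concurrent`: fan work out over a fixed thread pool and collect the results through futures. The work unit here (squaring integers) is a stand-in, since the actual tasks are not described in this profile.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

public class ParallelWork {

    // Runs one task per input on a fixed-size pool and sums the results.
    static int sumSquares(List<Integer> inputs, int threads) throws Exception {
        ExecutorService pool = Executors.newFixedThreadPool(threads);
        try {
            List<Future<Integer>> futures = new ArrayList<>();
            for (int n : inputs) {
                final int v = n;
                futures.add(pool.submit(() -> v * v)); // each square runs on the pool
            }
            int total = 0;
            for (Future<Integer> f : futures) total += f.get(); // blocks until done
            return total;
        } finally {
            pool.shutdown(); // release the worker threads
        }
    }

    public static void main(String[] args) throws Exception {
        System.out.println(sumSquares(List.of(1, 2, 3, 4), 2)); // 1 + 4 + 9 + 16 = 30
    }
}
```

Sizing the pool to the workload (CPU-bound vs. I/O-bound) is the main efficiency lever this pattern exposes.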
CONFIDENTIAL
- Created a web infrastructure, installed the Java development tools, and configured the different tools to work with each other.
- Created a web-based Java application for a vector generation system used for Verilog verification, built on the JavaServer Faces (JSF) framework. Also installed all the associated tools, including NetBeans, Ant, the GlassFish application server, and Tomcat. The application generates the test vectors required for full-chip logic testing. Built and managed a MySQL database to store standard tests that can be used universally. Also used the JUnit framework for a test-driven methodology.
- Created custom tag libraries and used them with UI components. Also created navigation rules and used backing beans for validation, event handling, and business logic.
- Carried out RF design projects for companies. Worked on the circuit blocks needed in direct-conversion receiver (zero-IF and low-IF) topologies required for a fully integrated CMOS solution. These circuit blocks included:
- a narrow-band source-degeneration LNA
- a wide-band gate-degeneration LNA
- variable gain amplifiers (VGAs) to increase the dynamic range of active RC filters
- All designs were done in a standard CMOS process, with familiarity with most wireless standards, including GSM, Bluetooth, UMTS, WLAN, WiMedia, WiMax, Zigbee, GPS, and UWB.
CONFIDENTIAL
- Lattice develops programmable logic devices (PLDs) that can be configured to implement various functions. Lattice pioneered ISP (In-System Programmability), which allows users to program EEPROM devices without removing them from the system board. Performed the following tasks:
- Designed a bandgap reference circuit using the latest device-mismatch-reduction techniques to generate a very stable voltage, implemented in a 0.13u process.
- Worked on designing a PLL, including a VCO that can run as fast as 2 GHz; also worked on the charge pump. The PLL uses a regulated supply, which helped reduce power supply noise. This was implemented in the 0.13u process.
- Designed charge pumps used both for pumping up voltages and for generating negative substrate bias.
- Designed a fully trimmable high-voltage generating circuit, designed to supply enough current drive and to regulate the voltage level. It is used to program the non-volatile memory (NVM), EEPROM in this case.
- Implemented and fully verified the JTAG state machine used as an interface to program the device from a JEDEC file via the JTAG interface pins. Used Verilog to generate test vectors to verify JTAG functionality.
- Generated vectors for testing all the boundary-scan functionality, also done in Verilog.
CONFIDENTIAL
- Responsible for running a network of 30 Sun SPARCs for a research lab developing device and process simulators and SOI SPICE, the first SPICE simulator for SOI technology. The staff included the original creator of the SUPREME process simulator and the leading authority on SOI technology.
- Upgraded all the systems from SunOS to the Solaris 2.4 operating system.
- Wrote CGI Perl scripts to help with administrative tasks such as tape backup and new user registration. Also wrote Unix shell scripts to automate system tasks and add functionality.
- Set up and installed all the CAD tools, including Cadence, three versions of SPICE, and process tools. Also performed benchmark testing and evaluation of all the tools.
- Configured the Cadence suite of tools and provided tutorials to new users on how to use them.
- Worked closely with users to provide support for CAD tools and Unix system issues.
- Negotiated with vendors for licenses.
Computer Skills
- CAD Tools
- Cadence (schematic entry, layout), HSPICE, SMARTSPICE, Verilog, TimeMill, Harmonic Balance, MATLAB, and ADS
- Operating Systems
- Unix (Solaris), MS Windows, Linux, VMware, HyperVM, OpenVZ VPS, Xen VPS, cloud computing, virtualization, RHEL
- System Admin Skills
- Solaris, Unix, Linux, Red Hat, CentOS, VMware, and Cloud Computing
- Databases
- MySQL, Oracle, MS SQL, DB2 UDB
Technologies
Java, J2EE, EJB 3.0, JMS, JSP, JMX, JAXB, JDBC, Servlets, Ant, ActiveMQ, ServiceMix, JBoss, Maven, Tomcat, SOA, JiBX, SOAP, JUnit, jQuery, XML, Spring MVC, Hibernate, SVN, CVS, shell scripting, Perl CGI, PHP, clustering, security, SAN/NAS, Cacti, Unix security, Apache, Avaya, Asterisk, FreeSWITCH, SIP, VoIP, SugarCRM, Vicidial, H.323, HTML, NIO, VoiceXML, WebSphere CE, GlassFish, WebLogic.
Area of focus: Electronics and Telecommunications