- 18 years of experience developing and managing information systems.
- Experience with enterprise application software design and development on a variety of platforms.
- Experience with Object Oriented Analysis and Design.
- Experience with Data Architecture and Data Governance.
- Experience with Object Oriented Programming using Java.
- Experience with Hadoop big data development using the latest Hadoop technologies.
- Experience with Web Services and REST architecture.
- Experience negotiating and establishing good relationships with business partners and users.
- Proficient communicator
- Experience leading both small and large teams
- Skilled negotiator
- Process improvement
- Project estimates and resource planning
- Requirements gathering
- Expertise in Hadoop big data development
- Expertise in analysis, design, and development
- Strong programming background in SQL, PL/SQL, Java, J2EE, and other web technologies.
Big Data Developer
- Developed several workflows to process check, claims, two-way SMS, collections, and other data using Oozie, Hive, Sqoop, YARN, Spark, DataFrames, Java, and Scala on the Cloudera distribution of Hadoop.
- Used Flume to process near-real-time messages.
- Developed a prototype to integrate Hadoop with Splunk.
- Used Sqoop's password-encryption tooling to encrypt all database passwords stored on servers.
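As a rough sketch of the password-encryption approach above: Sqoop can resolve passwords from an encrypted Hadoop credential keystore via `--password-alias`, so nothing is stored in plain text. The keystore path, alias, connection string, and table names below are illustrative, not the actual values used.

```shell
# Store the database password in an encrypted JCEKS keystore
# (path and alias are hypothetical examples).
hadoop credential create pricing.db.password \
    -provider jceks://hdfs/user/etl/passwords.jceks

# Sqoop resolves the alias at runtime, so no plain-text password
# sits on the server or appears on the command line.
sqoop import \
    --connect jdbc:oracle:thin:@//dbhost:1521/PRICING \
    --username etl_user \
    --password-alias pricing.db.password \
    --table CLAIMS \
    --target-dir /data/raw/claims
```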
- Worked extensively with Spark DataFrames; performance-tuned several Spark jobs.
- Managed and enhanced two key Commercial Credit Pricing applications.
- Established good relationships with business partners and application users.
- Reduced application hosting costs and downtime by building new, less expensive VM-based infrastructure.
- Established centralized L1/L2 support and issue tracking by onboarding the application to the support desk.
- Reduced risk by motivating and cross-training team members across multiple technologies and functional modules.
- Reduced application support time and increased efficiency by automating feeds, building data quality checks, adding email notifications with error details, and moving all job scheduling to Confidential.
- Responsible for end-to-end design, development, delivery, and support of the credit pricing domain.
- Managed a team of 8-12 people, both offshore and onsite.
- Developed several enhancements to the Credit Pricing tool, e.g., processing and generating Excel reports, writing Java schedulers for batch jobs, and building an integration with Access Management.
- Developed Hadoop ELT jobs using Sqoop, Hive, Pig, and Oozie to source required data from various source systems.
- Drastically reduced the processing time of a key process by using Hadoop's distributed processing.
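A Sqoop-then-Hive ELT job of the kind described above is typically orchestrated in Oozie workflow XML, roughly as follows. This is a minimal sketch: the action names, table, paths, and `${...}` parameters are hypothetical placeholders, not the actual job definitions.

```xml
<workflow-app name="claims-elt" xmlns="uri:oozie:workflow:0.5">
    <start to="import-claims"/>
    <!-- Step 1: pull raw data from the source RDBMS with Sqoop -->
    <action name="import-claims">
        <sqoop xmlns="uri:oozie:sqoop-action:0.4">
            <job-tracker>${jobTracker}</job-tracker>
            <name-node>${nameNode}</name-node>
            <command>import --connect ${jdbcUrl} --username ${dbUser} --password-alias pricing.db.password --table CLAIMS --target-dir /data/raw/claims -m 4</command>
        </sqoop>
        <ok to="load-claims"/>
        <error to="fail"/>
    </action>
    <!-- Step 2: load/transform the raw files into Hive tables -->
    <action name="load-claims">
        <hive xmlns="uri:oozie:hive-action:0.5">
            <job-tracker>${jobTracker}</job-tracker>
            <name-node>${nameNode}</name-node>
            <script>load_claims.hql</script>
        </hive>
        <ok to="end"/>
        <error to="fail"/>
    </action>
    <kill name="fail">
        <message>ELT failed: ${wf:errorMessage(wf:lastErrorNode())}</message>
    </kill>
    <end name="end"/>
</workflow-app>
```

In practice such a workflow would be triggered on a schedule by an Oozie coordinator, with failures routed to the kill node's error message for alerting.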
Confidential, Oklahoma City and NY
- Worked on Confidential's v3 product development team, developing several key features of the v3 product. Technologies: Tapestry 3.x, Google Web Toolkit, Hibernate, Oracle, Maven, SVN, JasperReports, WebLogic, Jetty.