- Over 15 years of experience in software systems development, integration, and implementation of core business applications using Java/J2EE and .NET technologies, spanning multiple platforms within the mortgage, retail, and financial services domains.
- Over 2 years of experience in the design, development, and implementation of big data solutions on the Hadoop ecosystem (Cloudera).
- Hands-on experience developing solutions to process large data sets using the Hadoop ecosystem: HDFS, MapReduce, Pig, Hive, Sqoop, Flume, and Oozie.
- Experienced in all aspects of data management, data architecture, data integration, and the implementation of business intelligence/ETL applications.
- Experienced in the design, development, and implementation of scalable application and data integration solutions using SOA (SOAP and REST web services).
- Experienced in object-oriented analysis, design, and programming in core Java and .NET.
- Experienced in managing highly critical and complex projects as lead architect, tech lead, and project lead.
- Experienced in managing the expectations of clients, vendors, and other stakeholders.
- Experienced in managing onshore/offshore delivery teams on several projects in a global matrix environment.
- Working knowledge of and experience with Agile methodology and the Scrum process.
- Quick to adopt new tools and technologies; a problem solver and an effective team player.
Programming Languages: Java, C#.NET, JavaScript, SQL
ETL Tools: DataStage 8.5, Informatica
Big Data Analytics: HDFS, MapReduce, Pig, Hive, Sqoop, HBase, Oozie, Flume, and Tableau
Databases: Oracle, DB2, MySQL, SQL Server and Teradata
Tools: TOAD, AutoSys, Splunk, AppWatch, Ant, JIRA
Technologies: J2EE, .NET
Integration Technologies: SOA, SOAP and REST web services, IBM WebSphere MQ/Message Broker, TIBCO BusinessWorks
Methodologies: Waterfall, Agile
Big Data/Hadoop Lead Developer
- Involved in the design and development of a data pipeline, storage, and processing platform (a Hadoop data warehouse on Cloudera) to capture and process very large mortgage-related data sets from multiple sources/channels within the mortgage domain and from external vendors, using Hadoop ecosystem technologies.
- Gathered and analyzed current and future ingestion requirements from each source system, created design documents, reviewed data sources and data formats, and recommended processes for loading data into Hadoop.
- Prototyped solutions on large data sets to improve data processing performance and reviewed deliverables with stakeholders.
- Hands-on experience implementing data acquisition, storage, transformation, and analysis solutions using Hadoop ecosystem components: HDFS, MapReduce, Pig, Hive, Flume, and Sqoop.
- Wrote Pig Latin scripts for data transformations and for loading data into stage and production Hive tables; analyzed data using Pig, MapReduce, HiveQL, and HCatalog.
- Created Hive tables to store data in HDFS, processed data using HiveQL, and exported Hive tables to relational data stores using Sqoop for visualization/reporting. Developed UDFs and used them in Hive queries (see the UDF sketch after this list).
- Hands-on experience with sequence files, RC files, combiners, dynamic partitions, and bucketing for best practices and performance improvement (a combiner example follows this list).
- Used Flume to collect and aggregate weblog data from different sources, such as web servers and network devices, and pushed it to HDFS (an interceptor sketch follows this list).
- Analyzed weblog data using HiveQL and identified opportunities for improving service levels (a query sketch follows this list).
- Supported ad-hoc queries across large data sets and was responsible for troubleshooting Hadoop applications and customizations.
- Used Oozie and AutoSys to schedule Hadoop jobs (a workflow submission sketch follows this list).
- Experienced with the Linux programming environment and the automation of ETL and data transfer jobs.
- Good understanding of Spark, Kafka, Storm, NoSQL, and HBase.
- Working knowledge of the Tableau data visualization tool; designed and published interactive Tableau workbooks and dashboards.
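To illustrate the Hive UDF work above: a minimal sketch using Hive's classic `UDF` API, assuming a hypothetical function that normalizes a loan identifier column (the package, class, and table names are illustrative, not from the original projects).

```java
package com.example.hive.udf; // hypothetical package

import org.apache.hadoop.hive.ql.exec.UDF;
import org.apache.hadoop.io.Text;

/**
 * Minimal Hive UDF sketch: trims and upper-cases a loan identifier.
 * Registered and called from HiveQL, e.g.:
 *   ADD JAR hive-udfs.jar;
 *   CREATE TEMPORARY FUNCTION normalize_loan_id
 *     AS 'com.example.hive.udf.NormalizeLoanId';
 *   SELECT normalize_loan_id(loan_id) FROM stage_loans;
 */
public class NormalizeLoanId extends UDF {
    public Text evaluate(Text input) {
        if (input == null) {
            return null; // Hive passes NULL column values as null references
        }
        return new Text(input.toString().trim().toUpperCase());
    }
}
```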
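A sketch of the combiner technique mentioned above: a MapReduce job that counts hits per URL in weblog data, reusing the reducer as a combiner so map output is pre-aggregated before the shuffle. The input field layout is an assumption for illustration.

```java
import java.io.IOException;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class UrlHitCount {

    public static class HitMapper extends Mapper<LongWritable, Text, Text, IntWritable> {
        private static final IntWritable ONE = new IntWritable(1);
        private final Text url = new Text();

        @Override
        protected void map(LongWritable key, Text value, Context ctx)
                throws IOException, InterruptedException {
            // Assumed layout: tab-separated weblog lines with the URL in column 2.
            String[] fields = value.toString().split("\t");
            if (fields.length > 1) {
                url.set(fields[1]);
                ctx.write(url, ONE);
            }
        }
    }

    public static class SumReducer extends Reducer<Text, IntWritable, Text, IntWritable> {
        @Override
        protected void reduce(Text key, Iterable<IntWritable> values, Context ctx)
                throws IOException, InterruptedException {
            int sum = 0;
            for (IntWritable v : values) {
                sum += v.get();
            }
            ctx.write(key, new IntWritable(sum));
        }
    }

    public static void main(String[] args) throws Exception {
        Job job = Job.getInstance(new Configuration(), "url-hit-count");
        job.setJarByClass(UrlHitCount.class);
        job.setMapperClass(HitMapper.class);
        // The reducer doubles as a combiner: partial sums on each mapper
        // cut the volume of data shuffled across the network.
        job.setCombinerClass(SumReducer.class);
        job.setReducerClass(SumReducer.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);
        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}
```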
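One way a Flume pipeline like the one above can be customized is with an interceptor. This is a hedged sketch, not the original implementation: a hypothetical interceptor that tags each weblog event with a header the HDFS sink can use in its path pattern.

```java
import java.util.List;
import org.apache.flume.Context;
import org.apache.flume.Event;
import org.apache.flume.interceptor.Interceptor;

/**
 * Hypothetical Flume interceptor that tags each weblog event with a
 * source-channel header before it reaches HDFS. Wired into an agent
 * config, e.g.:
 *   agent.sources.web.interceptors = tag
 *   agent.sources.web.interceptors.tag.type = SourceTagInterceptor$Builder
 */
public class SourceTagInterceptor implements Interceptor {

    @Override
    public void initialize() { /* no state to set up */ }

    @Override
    public Event intercept(Event event) {
        // Header value is a placeholder; a real deployment might derive it
        // from the event body or the agent's configuration.
        event.getHeaders().put("sourceChannel", "webfarm");
        return event;
    }

    @Override
    public List<Event> intercept(List<Event> events) {
        for (Event e : events) {
            intercept(e);
        }
        return events;
    }

    @Override
    public void close() { /* nothing to release */ }

    /** Flume instantiates interceptors through a Builder. */
    public static class Builder implements Interceptor.Builder {
        @Override
        public Interceptor build() {
            return new SourceTagInterceptor();
        }

        @Override
        public void configure(Context context) { /* no config needed */ }
    }
}
```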
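Ad-hoc HiveQL analysis like the weblog work above can also be driven from Java through the HiveServer2 JDBC driver. A minimal sketch, assuming hypothetical host, database, and table names:

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class WeblogQuery {
    public static void main(String[] args) throws Exception {
        // HiveServer2 JDBC driver; host/database names are placeholders.
        Class.forName("org.apache.hive.jdbc.HiveDriver");
        try (Connection conn = DriverManager.getConnection(
                 "jdbc:hive2://hive-host:10000/weblogs", "etl", "");
             Statement stmt = conn.createStatement();
             ResultSet rs = stmt.executeQuery(
                 "SELECT status_code, COUNT(*) AS hits "
                 + "FROM access_logs GROUP BY status_code")) {
            while (rs.next()) {
                // Print each status code and its hit count, tab-separated.
                System.out.println(rs.getString(1) + "\t" + rs.getLong(2));
            }
        }
    }
}
```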
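Oozie workflows like those scheduled above can be submitted programmatically with the Oozie client API. A minimal sketch, with the server URL and HDFS application path as placeholders:

```java
import java.util.Properties;
import org.apache.oozie.client.OozieClient;

public class SubmitWorkflow {
    public static void main(String[] args) throws Exception {
        // Oozie server URL and HDFS paths are hypothetical placeholders.
        OozieClient oozie = new OozieClient("http://oozie-host:11000/oozie");

        Properties conf = oozie.createConfiguration();
        conf.setProperty(OozieClient.APP_PATH,
                "hdfs://namenode:8020/apps/ingest-workflow");
        conf.setProperty("nameNode", "hdfs://namenode:8020");

        // Submits and starts the workflow, returning its job id.
        String jobId = oozie.run(conf);
        System.out.println("Workflow job submitted: " + jobId);
    }
}
```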
Application and Data Integration Lead Developer/Architect
- Key player in the Countrywide and Confidential merger and integration effort. Worked in the systems and data integration space and provided thought leadership and direction for developing SOA-based solutions for the integration of mortgage servicing and core banking systems.
- Involved in the design, development, and delivery of the i-Share application, built with IBM DataStage 8.5 to transform and load data across multiple source and destination nodes. i-Share is a middleware layer between the mortgage servicing system (IBM i Series) and 280+ upstream/downstream applications within the mortgage domain, supporting business-critical functionality in the home loans department.
- Designed and developed DataStage PX ETL jobs that extract data from sequential files in different formats, apply transformations to the incoming data (including joins with other sources for additional data and lookups for reference information), and store the results in the target i Series database.
- Designed and developed outbound processing jobs in DataStage that extract data from the i Series database, apply transformations to the extracted data, send the final results to vendors via the FTP stage, and simultaneously store the data in an Oracle database.
- Architected data storage jobs (parallel jobs, sequence jobs, and shared containers) to read input files and load data into customer databases, improving performance.
- Ensured that any gaps in quality or performance were quickly identified and addressed.
- Implemented process automation through an automated scheduler and shell scripts to execute ETL jobs.
- Paired with developers when needed to help them catch up on business knowledge.
- Participated in technical discussions during planning and design sessions as part of every sprint.
- Created ETL process flows and low-level design documents, and provided detailed instructions to team members.
- Designed and delivered time-critical integration solutions for foreclosure, bankruptcy, fees, CASH, loan modifications, and case management using the i-Share framework.
- Designed, delivered, and managed an enterprise-level event handling framework (publish/subscribe model) for document management using IBM middleware technologies (a JMS publisher sketch follows the environment list below).
Environment: Java/J2EE, JDBC, JMS, Spring, Hibernate, IBM i Series (AS/400), DB2, Oracle 11g, IBM DataStage v8.5, Teradata, IBM Message Broker v7.0, Ant, SVN, AutoSys, SOAP and REST web services, shell scripting, SoapUI, Splunk, Hadoop, MapReduce, Pig, Hive, Flume, Sqoop.
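To illustrate the publish/subscribe framework above: a minimal JMS topic publisher sketch using JNDI-administered objects. The JNDI names and message payload are hypothetical; in a WebSphere MQ deployment they would resolve to administered connection factories and topics.

```java
import javax.jms.Connection;
import javax.jms.ConnectionFactory;
import javax.jms.MessageProducer;
import javax.jms.Session;
import javax.jms.TextMessage;
import javax.jms.Topic;
import javax.naming.InitialContext;

public class DocumentEventPublisher {
    public static void main(String[] args) throws Exception {
        // JNDI names are placeholders for administered WebSphere MQ objects.
        InitialContext ctx = new InitialContext();
        ConnectionFactory factory = (ConnectionFactory) ctx.lookup("jms/DocEventCF");
        Topic topic = (Topic) ctx.lookup("jms/DocumentEvents");

        Connection connection = factory.createConnection();
        try {
            Session session = connection.createSession(false, Session.AUTO_ACKNOWLEDGE);
            MessageProducer publisher = session.createProducer(topic);
            TextMessage message = session.createTextMessage(
                "{\"documentId\":\"12345\",\"event\":\"INDEXED\"}");
            publisher.send(message); // the broker fans the event out to all subscribers
        } finally {
            connection.close(); // also closes the session and producer
        }
    }
}
```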
- Involved in the migration of real estate lending servicing/default systems onto a single common platform (CitiLink) and led two integration projects as part of this migration effort.
- Involved in the application design, development, and implementation of Citi's Debt Restructure Strategic Initiative project, and successfully delivered ETL applications and dialer projects under this program.
- Delivered Citi's DRI vendor projects, which helped replace many cumbersome mainframe screens with web-based solutions. DRI is a default system used to service defaulted loans.
Environment: Servicing and Default Platforms (DRI, CitiLink, MortgageServ), Java, J2EE, SQL Server 2000, Informatica, and Cognos.
- Enabled a multi-brand property search feature for various channels of the Cendant Hospitality group using search and booking services, which were developed with TIBCO BusinessWorks and Java/J2EE.
- Led the project lifecycle from start-up through deployment using standard development and delivery methodologies; managed a team of 15 onshore/offshore resources and was involved in the design, development, and implementation of the search and booking services project.
Environment: Java, JSP, Servlets, EJB, TIBCO BusinessWorks, XML, Oracle 9i, SQL Server, LoadRunner
Senior Systems Analyst
- Led requirements gathering, analysis, design, development, and delivery activities for the Seranova Corporate Information Portal. The project goal was to automate corporate processes and provide centralized information access to all employees.
- Implemented the Coreport portal within multiple operating companies of FairFax, a large insurance holding company. The Coreport portal provides a single, open, strategic framework for the deployment, integration, and management of all enterprise assets.