- Solutions Architect / Sr. Technologist specializing in Data Engineering
- Leverages business acumen and hands-on technology experience to deliver innovative solutions
- Hands-on Sr. Technologist with a strong Java and data engineering background in the financial services industry, skilled in delivering viable solutions that positively impact organizational capability and performance.
- Offers a blend of excellence in technology development and people management, with a proven ability to plan and execute complex initiatives and manage multiple workstreams.
- Data Lake platform: data ingestion and transformations using Big Data technologies such as Hadoop, Kafka, Hive, Druid, Dremio, and Spark (Scala or Java)
- Responsible for key technical decisions and the selection of suitable cutting-edge open-source development technologies, tools, and methodologies
- Proficient in data analysis using Excel and reporting tools such as Tableau with Presto acceleration
- Experience in AWS cloud application architecture, along with DevOps proficiency, has helped drive value through prudent development and release management processes.
- Strategic thinker capable of understanding complex business problems to identify strategic opportunities, determine business requirements and build technology solutions.
- Specialties in Reference Data, Prime Brokerage, Stock Loan, Confidential and Risk Data Engineering technology applications.
Big Data Ecosystem: HDFS, Hadoop, HBase, Cassandra, Spark
Solutions Architect / Lead / Sr. Technologist
- Solutions Architect / Lead for Reporting and Controls for Confidential in a Cloudera big data environment. Hands-on proofs of concept and development for reporting in technologies such as Spring Boot, Druid, Hive, and Apache Spark using Scala/Java, primarily in a Hadoop environment.
- Engineered a Confluent Kafka-based solution to integrate Risk IT application events, exceptions, and batch monitoring.
- Developed ETL jobs in Spark (Scala or Java) to produce flat, canned report data curated for Tableau or Druid reporting needs.
- Led design and development of an adjustments-and-overrides framework, a critical Risk IT process that compensates for data gaps in day-to-day operations, using a Spring Boot container with Scala on the Cloudera platform, with Spark writing Parquet files exposed through Hive external tables.
- Onboarded Risk IT by developing a lightweight framework for Druid OLAP platform data ingestion, enabling sub-second data analysis; this is the preferred analytics tool for the Middle Office.
- Evaluated multiple workflow orchestration tools such as Netflix Conductor and Apache Airflow
- Key contributor to cloud migration, working closely with cloud specialists to establish a DevOps pipeline, with proofs of concept on cloud infrastructure such as PCF and AWS. Comfortable with AWS services such as EC2, EBS, IAM, S3, ELB, Redshift, RDS, VPC, Route 53, CloudWatch, and CloudFormation.
- Work closely with project managers, product managers, and senior technology executives responsible for delivery to create a roadmap of IT deliverables and facilitate delivery using agile methodology.
Languages/Technologies: Java 8, Scala, Spring Framework, Spring Boot, Spark, Hive, Druid, HDFS, MS SQL Server, Microservices, Netflix OSS, Jenkins, Automic, Jira, GitLab, AngularJS 1.6
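The adjustments-and-overrides framework described above boils down to a simple precedence rule: analyst-supplied overrides win over sourced values, without mutating the raw feed. A minimal JDK-only sketch of that rule follows; class and field names are illustrative assumptions, and the production version persisted both layers as Parquet behind Hive external tables rather than merging in memory.

```java
import java.util.HashMap;
import java.util.Map;

/**
 * Minimal in-memory sketch of an adjustments/overrides step: overrides are
 * applied on top of sourced risk measures so downstream reports see corrected
 * values while the raw feed stays untouched. Names are illustrative only.
 */
public class OverrideFramework {

    /**
     * Returns a merged view: the override layer takes precedence per key;
     * base rows without an override pass through unchanged. Neither input
     * map is modified.
     */
    public static Map<String, Double> applyOverrides(Map<String, Double> base,
                                                     Map<String, Double> overrides) {
        Map<String, Double> merged = new HashMap<>(base);
        merged.putAll(overrides); // override wins where both layers have a key
        return merged;
    }

    public static void main(String[] args) {
        Map<String, Double> base = new HashMap<>();
        base.put("BOOK-A", 100.0);
        base.put("BOOK-B", 250.0);

        Map<String, Double> adjustments = new HashMap<>();
        adjustments.put("BOOK-B", 275.0); // analyst adjustment for a data gap

        System.out.println(applyOverrides(base, adjustments));
    }
}
```

Keeping the override layer separate from the base data also preserves an audit trail of what was adjusted, which matters in a regulated Risk IT context.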
Confidential, New York, NY
- Hands-on Lead Engineer/Architect (including visioning and architecting the IT landscape) for risk reference data sourcing and adoption across risk verticals such as Confidential and Credit Risk, predominantly using server-side technologies: Core Java, Unix, multithreading, design patterns, and data structures.
- Developed and delivered risk technology metadata services web applications using full-stack technologies such as AngularJS, Spring Boot, JPA, and Hibernate, enabling risk data users to visualize data relationships and consumption paths and thus gain contextual information.
- Enabled faster data distribution for on-demand analysis (from 1 hour to 5 seconds) to risk data consumers by leveraging the compression ratio of Parquet files and serving them with Apache Spark and Spark SQL.
- Streamlined SQL Server ETL processing (~400 GB daily against a 10 TB live database) and alleviated bottleneck contention using a distributed synchronization/lock pattern built on Apache ZooKeeper and multithreading.
- Led a global team of 20 developers on a BCBS data strategy project involving global cross-functional teams and high-visibility stakeholders in middle office, change management, and senior IT leadership; achieved software platform simplification and accurate Confidential reporting by migrating reference data consumption from multiple front-office feeds to the firm-mandated source.
- Spearheaded tools and process standardization in collaboration with the Operations and Production Support teams to set up a DevOps framework of best practices for technology rollout to production, significantly reducing technology operational risk and achieving annual savings of $1M in operational costs.
- Consolidated application configuration management and CI/CD build cycles using ZooKeeper, GitLab, Jenkins, Bamboo, Nexus Repository, and Automic.
- Executed key regulatory requirements by delivering application changes for calculating risk measures such as VaR and SVaR, as well as IRC/CRM capital requirements, for a given reporting period.
Languages/Technologies: Java 8, Spring Framework, Spring Boot, Spring Security, JPA, Hibernate, MySQL, MS SQL Server, Microservices, Jenkins, Bamboo, F5 (load balancer), Automic, Jira, GitLab, AngularJS 1.6, Bootstrap, D3, Agile methodology, Test-Driven Development, Apache Spark, ZooKeeper, MS Project.
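The distributed lock pattern mentioned above follows ZooKeeper's well-known sequential-node lock recipe: each contender takes a monotonically increasing sequence number, and whoever holds the lowest outstanding number owns the lock. The sketch below illustrates that ordering logic with JDK monitors only; it is a single-process stand-in, not the actual ZooKeeper-backed implementation, where the sequence comes from ephemeral sequential znodes and waiters watch their predecessor.

```java
import java.util.TreeSet;

/**
 * JDK-only illustration of the sequential-node lock recipe underlying the
 * ZooKeeper-based distributed lock: contenders take increasing sequence
 * numbers, and the lowest outstanding number owns the lock. In production
 * the sequence is allocated by ephemeral sequential znodes; here it is
 * simulated in-process with a counter and a sorted set.
 */
public class SequenceLock {
    private long nextSeq = 0;                    // simulates znode sequence allocation
    private final TreeSet<Long> waiting = new TreeSet<>();

    /** Register a contender and block until it holds the lowest sequence. */
    public synchronized long acquire() throws InterruptedException {
        long mySeq = nextSeq++;
        waiting.add(mySeq);
        while (waiting.first() != mySeq) {       // a lower sequence still holds/waits
            wait();
        }
        return mySeq;                            // we now own the lock
    }

    /** Delete our "node": drop our sequence and wake the remaining contenders. */
    public synchronized void release(long mySeq) {
        waiting.remove(mySeq);
        notifyAll();
    }
}
```

Because sequence numbers are unique and only the minimum proceeds, the recipe is both mutually exclusive and first-come-first-served, which is what keeps competing ETL workers from thrashing a shared database resource.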
Confidential, New York, NY
- Technical Lead and driver for a team of 9+ developers and QA engineers executing development of an economic credit capital model, including construction and integration of credit and capital limits, enabling provision of strategic advice and solutions to originating businesses.
- Integrated RESTful limit services across the organization, aggregating them with a Spring Integration-based Java application and caching them in GemFire, indexed by date with event-driven updates, for a superior user experience in credit and capital limits reporting interfaces.
Languages/Technologies: Java 8, Spring Integration, TIBCO, Oracle, REST API, GemFire Cache, Microservices, Jenkins, Jira, SVN, Agile methodology, Test-Driven Development.
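The date-indexed, event-updated cache described above has a simple access shape: aggregated limit values keyed by business date, with individual entries replaced as update events arrive, so the reporting UI always reads current values without re-aggregating. A JDK-only sketch of that shape follows; the production system used GemFire regions fed by a Spring Integration flow, and the class and method names here are illustrative assumptions.

```java
import java.time.LocalDate;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

/**
 * Sketch of a limits cache indexed by business date with event-driven
 * per-entry updates. Stand-in for a GemFire region; names are illustrative.
 */
public class LimitsCache {
    // business date -> (limitId -> current limit/utilization value)
    private final Map<LocalDate, Map<String, Double>> byDate = new ConcurrentHashMap<>();

    /** Event-driven update: upsert a single limit value for a business date. */
    public void onLimitEvent(LocalDate date, String limitId, double value) {
        byDate.computeIfAbsent(date, d -> new ConcurrentHashMap<>())
              .put(limitId, value);
    }

    /** Read path used by the reporting UI: all limits for one date. */
    public Map<String, Double> limitsFor(LocalDate date) {
        return byDate.getOrDefault(date, Map.of());
    }
}
```

Indexing by date keeps historical snapshots queryable while events only touch the current day's entries, which is what makes the read path fast enough for interactive reporting.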
Confidential, New York, NY
- Worked on green-field projects, building the team: hiring and mentoring a team of 5+ developers, with additional matrix reporting of 4+ QA engineers.
- Led execution of trade management functionality for hedge fund clients as well as internal account managers for straight-through processing of trades, increasing operational efficiency through automation of prime brokerage account/trade management processes using Java and Axis2 web services.
- Designed and developed a client-facing application that lets hedge funds book a borrow and provides an end-to-end interface to execute that borrow and monitor its progress. Server-side components used SOAP/XML APIs, Core Java, and a multithreading framework built on the concurrency API, Spring JMS, Spring Integration, and TIBCO EMS.
- Delivered successful execution of centralized prime brokerage data services for transactional and reference data such as Activity, Position, Cash Flows, Instrument, Prices, Client, and Account, using JBoss Cache, CXF web services, and Spring technologies for database integration and messaging.
- Collaborated with the business product development group to deliver reporting requirements for products supported by the clearinghouse per OTC2CCP requirements.
- Coordinated and collaborated with infrastructure teams on releases and worked with support teams to handle production issues, finding workarounds in firefighting situations.
Languages/Technologies: Java 6, concurrency/multithreading APIs, Spring Framework, Spring Integration, Sybase, Oracle, TIBCO EMS, CXF, JMS, REST API, Jenkins, Jira, SVN, Autosys
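The book/execute/monitor borrow lifecycle above maps naturally onto the concurrency API: a request is submitted to a worker pool, and the caller monitors progress through a Future rather than blocking. The sketch below shows only that asynchronous shape using the JDK; the production flow ran over Spring JMS/Integration and TIBCO EMS, and the class and return-value format here are illustrative assumptions.

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

/**
 * Sketch of asynchronous borrow processing with the java.util.concurrent API:
 * submit a booking request to a worker pool, get back a Future handle the UI
 * can use to monitor progress. Names and result format are illustrative.
 */
public class BorrowProcessor {
    private final ExecutorService pool = Executors.newFixedThreadPool(4);

    /** Submit a borrow request; returns a handle to poll for completion/status. */
    public Future<String> submitBorrow(String security, int quantity) {
        return pool.submit(() -> {
            // ... locate inventory, negotiate rate, book the borrow ...
            return "BOOKED:" + security + ":" + quantity;
        });
    }

    /** Stop accepting work and let in-flight bookings finish. */
    public void shutdown() {
        pool.shutdown();
    }
}
```

Decoupling submission from execution this way keeps the client-facing interface responsive while long-running booking steps proceed on the pool.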