- Designs, develops, and maintains large-scale, real-time event and log data processes in a Confidential Spark AWS environment using Java, Scala, and Python.
- Designs and develops scalable data stores with sub-second query latency on massively parallel processing (MPP).
- Understands the end-to-end operations of complex Confidential-based ecosystems, managing and configuring open-source core technologies such as HDFS, Pig, Hive/Beeline 2, MapReduce, YARN, HBase, ZooKeeper, and Kafka.
- Moves data between and within Confidential clusters to integrate with other ecosystems (for instance, cloud storage); configures replication, backups, and resiliency strategies for data on the cluster.
- Confidential (Hive, Pig, MapReduce, Sqoop, Oozie)
- Agile & Waterfall methodologies
- Kafka
- Spark / Scala
- Storm
- Lambda Architecture
- Apache Aurora
- Akka
- Mesos
- Ansible
- Zeppelin
- Jupyter
- SQL Server
- Linux (Bash, AWK)
- S3
- EC2
- AWS Aurora
- Matillion
- SharePoint 2010/2013
- C#
- Classic ASP
- VBScript
Confidential, San Francisco, CA
Relevant Technologies: Kafka, Spark, Beam, Java, JSON, YAML, Zeppelin, Confidential, Mesos, Ansible, Linux distros (Cloudera, CentOS), Jenkins, Rundeck, Confluence, JIRA, GitLab repository access tokens, Windows 7/10 Enterprise
- Created a data pipeline with four other developers to implement fault tolerance in a scaled-out distributed environment
- Optimized the company's bottlenecked workflow system from ~4,500 to ~2M message writes per second by migrating from RabbitMQ to an optimally partitioned Kafka configuration
- Automated conversion of 837 claim shards from S3 buckets to XML and then to Parquet via Spark 2.1.0/Scala 2.11.8 transformations and actions
- Created an SDK in Scala, including Kryo serialization and Docker containerization, for Spark implicit conversions and data serialization
- Created automation pipelines for EDI to XML to Parquet transformation
- Used Spring MVC to display batch interval content and Kibana for dynamic data ingestion content via NiFi processors
- Used ksh and Bash scripting to create variables and automation for UNIX and Linux systems
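The RabbitMQ-to-Kafka throughput gain above came largely from key-based partitioning, which lets many consumers ingest in parallel. A minimal sketch of that idea follows; Kafka's default partitioner actually uses murmur2 hashing, so the MD5 stand-in and the partition count here are illustrative assumptions, not the production configuration.

```python
# Sketch of Kafka-style key-based partition assignment: same key always
# maps to the same partition (preserving per-key ordering) while distinct
# keys spread writes across partitions for parallel ingestion.
import hashlib

NUM_PARTITIONS = 12  # hypothetical partition count


def assign_partition(key: str, num_partitions: int = NUM_PARTITIONS) -> int:
    """Deterministically map a message key to a partition index."""
    digest = hashlib.md5(key.encode("utf-8")).digest()
    return int.from_bytes(digest[:4], "big") % num_partitions


# Repeated keys are stable; a stream of distinct keys fans out.
assert assign_partition("claim-42") == assign_partition("claim-42")
used = {assign_partition(f"claim-{i}") for i in range(1000)}
```

With enough distinct keys, writes spread across all partitions, so throughput scales with the partition/consumer count rather than a single queue.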
Relevant Technologies: Confidential, Spark, Java, Cassandra NoSQL, AWS (Amazon Web Services) ecosystem, Docker, Talend, Matillion, JSON, YAML, XML, Zeppelin, Mesos, Ansible, Linux distros (Cloudera, CentOS)
- Stood up running multi-node Kafka, Cassandra, and Spark clusters with Dockerized Mesos master/agent containers on AWS RHEL images, supporting an AWS-based Lambda Architecture combining streaming (online) and batch (offline) processing
- Automated Spark integration on AWS Elastic MapReduce and AWS Redshift using Talend
- Extracted, transformed and loaded data from data warehouse (Redshift)
- Implemented data pipeline so that data can seamlessly move from S3 to Redshift
- Used Bash scripting to create variables and automation for UNIX and Linux systems
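The Lambda Architecture referenced above serves queries by merging a precomputed batch view with an incremental real-time (speed-layer) view. The sketch below illustrates only the serving-layer merge; the counters and event names are hypothetical stand-ins for the actual Spark batch and streaming views.

```python
from collections import Counter

# Batch layer: view precomputed over the immutable master dataset.
batch_view = Counter({"event_a": 10_000, "event_b": 7_500})

# Speed layer: incremental counts not yet absorbed by a batch recompute.
realtime_view = Counter({"event_a": 42, "event_c": 3})


def query(event: str) -> int:
    """Serving layer: answer = batch view + real-time delta."""
    return batch_view[event] + realtime_view[event]


print(query("event_a"))  # 10042
print(query("event_c"))  # 3
```

Each batch recompute absorbs the speed layer's deltas and the real-time view is reset, which is what keeps the streaming side small and fault-tolerant.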
Relevant Technologies: Confidential, Spark, Storm, Trident API, Java, Teradata, AWS (Amazon Web Services) ecosystem, JSON, XML, Linux distros (Cloudera, CentOS)
- Enabled an automated system to import data from legacy RDBMS sources via drivers and from streaming HTTP/online APIs
- Implemented Storm to pull feeds from various intranet and extranet B2B sites
- Constructed and activated multi-node Storm topologies to run on top of Nimbus/Supervisor daemons
- Extracted, transformed and loaded data from data warehouse (Teradata & Redshift)
- Bash scripting (Linux distros)
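A Storm topology like those described wires spouts (tuple sources) to bolts (processing steps). The sketch below simulates that dataflow in plain Python for illustration; real Storm code extends BaseRichSpout/BaseRichBolt in Java and is submitted to Nimbus, and all record formats and names here are hypothetical.

```python
# Conceptual spout -> bolt -> bolt pipeline, mirroring how a Storm
# topology pulls a feed and transforms tuples stage by stage.
def feed_spout(records):
    """Spout: emit one tuple per raw feed record."""
    for rec in records:
        yield rec


def parse_bolt(tuples):
    """Bolt: split 'symbol,price' records into typed fields."""
    for t in tuples:
        symbol, price = t.split(",")
        yield symbol, float(price)


def filter_bolt(tuples, threshold):
    """Bolt: keep only tuples whose price exceeds the threshold."""
    for symbol, price in tuples:
        if price > threshold:
            yield symbol, price


raw_feed = ["ABC,10.5", "XYZ,99.0", "DEF,3.2"]
result = list(filter_bolt(parse_bolt(feed_spout(raw_feed)), threshold=5.0))
print(result)  # [('ABC', 10.5), ('XYZ', 99.0)]
```

In production, Nimbus schedules these components across Supervisor nodes, so each bolt stage can be parallelized independently.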
Confidential, New York, New York
- Created Custom Web Parts to create new Ideas, specify the number of Phases needed for full implementation, and set up a Subsite for managing each idea's state
- Used Web Services (both self-developed .asmx services and the out-of-the-box .svc services that ship with SharePoint), along with SharePoint's built-in APIs, to retrieve lists of SharePoint objects residing on the server.
- Utilized SPFarm, SPServer, and SPSiteCollections enumerated objects to track all farm-server relationships and counts of SharePoint objects online across the entire farm.
- Used Play Framework for Java to rapidly display ad hoc SharePoint results to clients.
- Increased security and control on SharePoint sites via the ASP.NET web.config by writing custom scripts that restricted which file extensions could execute for a given HTTPHandler request type, adding seamless HTTP redirection and error URL display.
- Created a Tab View Web Part to display the various phases of each selected idea and enable editing of the properties in each phase, people responsible for that phase and track status of each task
- Created an Alert Flasher Footer Control that would roll up alerts and notifications about the idea from a Custom Web Service
- Enabled AJAX in SharePoint to make the Alert Flasher Footer automatically update itself every 30 seconds with the next alert
- Managed tasks for the Testing and UI Design teams, ensuring they had all the necessary tools, builds, and servers to perform their duties
Confidential, Chicago, Illinois
Software Developer
- Worked as part of a large team of Developers and Testers to build the portal
- Built many of the pages in the Ticketing Engine in J2EE (Java) & .NET.
- Worked on all Tiers in the multi-Tier application.
- Worked on the Database Tier to retrieve database data into Business Objects
- Worked in the Business Tier to incorporate business rules such as Ticket Expiry, Escalation, and Department Assignments
- Created ASPX pages, CSS style classes and worked on the presentation tier.
- Used ASP.NET AJAX to provide better UX to the users as they navigated through the multiple screens to submit their tickets.
- Automated batch files by writing COM business objects in Visual Basic 5.0 and in classic ASP
- Wrote and implemented Crystal Reports 4.6-8.5 automation using the 32-bit SDK (Software Development Kit) to import TTX files and render them in Crystal Reports format