Senior Developer Resume
Pasadena, CA
SUMMARY
- Developer/Architect with over 15 years of industry experience in Information Technology.
- Extensive experience as an evangelist and technical liaison between IT and customer organizations, driving the design and development effort in projects of significant size and complexity.
- Proficient in data architecture and in designing, developing, and implementing large-scale, distributed, multi-tiered solutions.
- Skilled at consulting on and contributing to initial concept design and development, building technical design specifications, and preparing business and functional requirements.
- Expert at analyzing, developing, and supporting complex applications with a focus on flexible design, scalability, and optimization for high performance.
- Excellent implementation, testing, profiling and debugging skills.
- Expertise across the SDLC in support of business needs and objectives, creating scalability through planning, analysis, and reporting.
- Proven ability to articulate business issues clearly, adhering to requirements and documenting those issues appropriately for all levels of the organization.
TECHNICAL SKILLS
Scrum/Agile: Target Process, VersionOne
Continuous Integration/Build/Test Tech: Jenkins, Maven, JUnit
Software/Data/BPMN Design Tools: ERWIN, StarUML, Oracle Designer, Rational Rose, DBDesigner, System Architect, Borland Together, Visio, ARIS Express
Cloud: AWS
Development/Integration Frameworks: Spring, Apache Karaf
BPM Rules Engines: Drools
Java IDEs: JDeveloper, Eclipse
Messaging Protocols/APIs: ZeroMQ, RabbitMQ, JMS, REST
Programming Languages: Java, Groovy, C#, C/C++, Python, PHP, ASP, PL/SQL, JSON/XML/XSL, Dynamic SQL, Cypher, Transact-SQL, JavaScript, Perl, OMNIS, Ada, JSP, Struts, VB6, PowerBuilder, Oracle Forms and Reports, Assembly
Web Development Technologies: JSP, ATG Dynamo, jQuery, AJAX, Struts, HTML, DHTML, JavaScript, SSL, EJB, XSQL, Castor, Java, J2EE, XML/XSL, SOAP/REST, JAXB, JSF, Hibernate
Database Technologies: AWS/RDS, Neo4j, Oracle Database, Oracle AQ, Cassandra, Hadoop/Cloudera/Impala/Spark, JDBC, SQL Server, Teradata, SSIS, SSAS, SSRS, MySQL/MariaDB, Informix, DB2, Postgres, Materialized Views, CTE (Common Table Expressions), MS Access, HeidiSQL, Toad, XSQL
Dashboard/Report Development: Tableau, Hoopla, SSAS/SSRS, SAS, Oracle Reports, Apache FOP, Crystal Reports, MS Access
Productivity: Microsoft PowerPoint, Word, Excel, Access, Visio, Project, Open Office Impress, Writer, Calc, Draw, Math, Base, Apple Pages, Numbers, Keynote, Google Docs.
ETL/ESB Tools: Informatica, Pentaho, Talend, Oracle Warehouse Builder, Mulesoft
Server-Side Technologies: AWS, Oracle Application Server/ WebLogic, ATG Dynamo, Apache Tomcat, Microsoft IIS
Packet Capture Analytics: Security Onion (Bro, Elsa & Xplico)
Configuration Management: Git/Bitbucket, SVN, Microsoft Visual SourceSafe, PVCS
ERP Systems: Oracle Order Entry, GL, AR, AP; SAP FICO, Payroll, ABAP Programming; iDempiere; ERP5
Operating Systems: Ubuntu Linux, Mac OS, Windows, Hyper-V, Raspbian.
4GL Tools: VB, PowerBuilder, Oracle Forms and Reports, OMNIS
PROFESSIONAL EXPERIENCE
Confidential, Pasadena, CA
Senior Developer
- Lead design and development of the RTDE (Real-Time Data Exchange) CDM and Mulesoft ESB, integrating the ADP tax reporting system with thousands of Federal, State, and Local tax agencies nationwide and transacting in excess of a trillion dollars of payroll taxes annually.
- Design and maintain the ADP canonical data model enabling cross-application eventing.
- Leverage Drools and SparkSQL to implement complex event processing for predictive analytics, preventing data errors and the resulting procedural escalations that consume thousands of man-hours and incur agency levies totaling millions of dollars each year (rule-session usage sketched after the technology list below).
- Develop event-sourcing system implementing the Kappa Architecture, using Oracle 12c and native JSON indexing to support event playback for system synchronization, batch analytics, and error analysis, extending existing real-time systems into full-spectrum data services (event-replay sketch below).
- Develop graph-based data systems to realize the benefits of ERP5-style abstract class models for the implementation of goal-based, workflow generation capabilities.
- Technology: ERWIN, StarUML, CA Rally, Mulesoft, Oracle, Oracle NoSQL, RDF, PL/SQL, SparkSQL, Flink, MongoDB, DB2, Drools Fusion, JSON/XML/XSD, Node.js, JAXB, JMS, Kafka, ZeroMQ, Spring, Spring Batch, Camel SEDA, API/REST, Tableau, IntelliJ, Visual Paradigm, Git/Bitbucket, Jenkins, Java, Ajax, Angular, Bootstrap, Slate, HTML, HTTPS, AIX
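Illustrative sketch of the Drools usage described above: a minimal, hedged Java example assuming a hypothetical PayrollEvent fact class and a kmodule-defined session named "payrollSession"; the production rules and CDM types are ADP-internal and omitted.

    import org.kie.api.KieServices;
    import org.kie.api.runtime.KieContainer;
    import org.kie.api.runtime.KieSession;

    public class PayrollCep {
        // Hypothetical event fact; the real CDM event types are ADP-internal.
        public record PayrollEvent(String agencyId, double amount) {}

        public static void main(String[] args) {
            KieServices ks = KieServices.Factory.get();
            KieContainer container = ks.getKieClasspathContainer();
            // "payrollSession" is an assumed session name from kmodule.xml.
            KieSession session = container.newKieSession("payrollSession");
            session.insert(new PayrollEvent("CA-EDD", 1_250_000.00));
            session.fireAllRules(); // DRL rules flag anomalies before escalation
            session.dispose();
        }
    }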
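The event-sourcing store can likewise be sketched with plain JDBC against Oracle 12c's native JSON support; the table, column, and connection details here are hypothetical, not the production schema.

    import java.sql.*;

    public class EventReplay {
        public static void main(String[] args) throws SQLException {
            try (Connection conn = DriverManager.getConnection(
                    "jdbc:oracle:thin:@//dbhost:1521/RTDE", "user", "pass")) {
                // Append-only event log; IS JSON is an Oracle 12c check constraint:
                //   CREATE TABLE rtde_events (
                //     seq_id  NUMBER GENERATED ALWAYS AS IDENTITY PRIMARY KEY,
                //     payload CLOB CONSTRAINT ev_json CHECK (payload IS JSON));
                try (PreparedStatement ins = conn.prepareStatement(
                        "INSERT INTO rtde_events (payload) VALUES (?)")) {
                    ins.setString(1, "{\"type\":\"TAX_FILED\",\"agency\":\"CA-EDD\"}");
                    ins.executeUpdate();
                }
                // Kappa-style replay: stream events in commit order, filtered by
                // type via JSON_VALUE (backed by a JSON index in production).
                try (PreparedStatement q = conn.prepareStatement(
                        "SELECT payload FROM rtde_events " +
                        "WHERE JSON_VALUE(payload, '$.type') = ? ORDER BY seq_id")) {
                    q.setString(1, "TAX_FILED");
                    try (ResultSet rs = q.executeQuery()) {
                        while (rs.next()) System.out.println(rs.getString(1));
                    }
                }
            }
        }
    }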
System Architect
- Led the inter-departmental effort to achieve real-time integration of people, processes and systems across all stages of the DWC Build-to-Order Workflow.
- Domain-modeled the data relationships and required interactions between the Sales, Procurement, Accounting, and BI systems as a design baseline for an inter-application messaging taxonomy extensible into the B2B realm for EDI transactional connectivity between DWC and its customers.
- Developed multi-threaded Talend ETL workflows to synchronize data between legacy Sales/Procurement/Inventory control systems and a custom next-gen solution. Parallel DML, round-robined across 8 concurrent database connections stacked 4 deep, transformed 200 complex table relationships containing in excess of 3 million rows within 26 minutes, exceeding the performance of the most efficient bulk-load solutions by a factor of 4 and enabling a dual-use design for periodic as well as transactional real-time synchronization across the Digital Nervous System (connection rotation sketched after the technology list below).
- Created the Digital Nervous System using ZeroMQ, the ZBeacon protocol, PHP, and Java as enabling technologies for a Chord-like P2P communication network of decentralized, self-discovering message proxies; in effect an implementation of the Proxy/Agent-Service design pattern, integrating services and applications in a format-neutral, location-transparent way while guaranteeing payload delivery and sustained transactional throughput of over 100,000 messages per second (forwarder sketch below).
- Installed and configured the open-source iDempiere multi-tenant ERP to leverage configurable transactional workflows, a “table as API” data architecture, and a modular plug-n-play extension framework as agent-service plug-points for Digital Nervous System integration between internal and partner systems, building out an industry-standard, DWC-branded build-to-order ecosystem.
- Propagated orders and quotes from Sales, and procurement data from Accounting, into schema-less Neo4j, where the data can be combined, dissected, and re-purposed without incurring the usual overhead for re-indexing and partitioning, making it possible to generate real-time streams across the Digital Nervous System for trend visualization in Hoopla and Tableau Online (graph-write sketch below).
- Installed Drools as a centralized clearing house and general interceptor of all events flowing through the Digital Nervous System, with OptaPlanner integration envisioned for the first quarter of 2016 as a means to inject agility and flexibility into business processes and to optimize procurement and logistics workflows.
- Replaced Google Cloud Print with home-brew PHP agent services that leverage the Digital Nervous System to spawn and pool-manage worker processes, concurrently generating and propagating shipping and inventory labels from a centralized sales and inventory control application to Raspberry Pi/CUPS endpoints deployed at 8 warehouse facilities nationwide; virtually eliminated failed transmissions and system downtime, shaved 30 hours from the overall build-to-order workflow, and reduced maintenance and support requirements by 16 hours every month.
- Technology: ERWIN, Talend (ETL), ALM/Target Process, Oracle, PL/SQL, MySQL/MariaDB, Neo4j/Cypher, Cassandra, Hadoop/Cloudera/Impala/Spark, SQL Server, T-SQL, Postgres, AWS/RDS, Drools, HeidiSQL, SSAS, Tableau, Hoopla, JSON/XML/XSD, SOA/SOAP/REST, JavaScript, Node.js, RabbitMQ, ZeroMQ/ZBeacon, Git/Bitbucket, Jenkins, Maven, Java, Ajax, C#, PHP, Python, ASP, HTML, HTTPS, Spring, JAXB, JMS, iDempiere, ERP5, Ubuntu Linux, CUPS, Windows Server, Hyper-V, VMWare, Raspbian Linux
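Sketch of the connection round-robin behind the Talend synchronization workflow above, reduced to plain Java for illustration; the production job was built from Talend components, and the table names are hypothetical.

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.util.List;
    import java.util.concurrent.*;
    import java.util.concurrent.atomic.AtomicInteger;

    public class RoundRobinDml {
        public static void main(String[] args) throws Exception {
            final int POOL = 8; // 8 concurrent database connections
            Connection[] conns = new Connection[POOL];
            for (int i = 0; i < POOL; i++)
                conns[i] = DriverManager.getConnection(
                        "jdbc:mariadb://dbhost/nextgen", "user", "pass");

            ExecutorService exec = Executors.newFixedThreadPool(POOL * 4); // stacked 4 deep
            AtomicInteger next = new AtomicInteger();
            List<String> tables = List.of("orders", "order_lines", "inventory");

            for (String table : tables) {
                exec.submit(() -> {
                    // Round-robin: each task takes the next connection in rotation.
                    Connection c = conns[next.getAndIncrement() % POOL];
                    synchronized (c) { // a JDBC connection is not thread-safe
                        try (var st = c.createStatement()) {
                            st.executeUpdate("INSERT INTO " + table
                                    + " SELECT * FROM legacy_" + table);
                        } catch (Exception e) {
                            e.printStackTrace();
                        }
                    }
                });
            }
            exec.shutdown();
            exec.awaitTermination(30, TimeUnit.MINUTES);
            for (Connection c : conns) c.close();
        }
    }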
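The proxy backbone of the Digital Nervous System can be illustrated with JeroMQ's XSUB/XPUB forwarder; ZBeacon-based peer discovery is omitted, and the port numbers are hypothetical.

    import org.zeromq.SocketType;
    import org.zeromq.ZContext;
    import org.zeromq.ZMQ;

    public class MessageProxy {
        public static void main(String[] args) {
            try (ZContext ctx = new ZContext()) {
                // Publishers connect to the frontend...
                ZMQ.Socket frontend = ctx.createSocket(SocketType.XSUB);
                frontend.bind("tcp://*:5550");
                // ...subscribers connect to the backend.
                ZMQ.Socket backend = ctx.createSocket(SocketType.XPUB);
                backend.bind("tcp://*:5551");
                // Blocks and shuttles traffic both ways (subscriptions upstream,
                // payloads downstream), giving location-transparent routing.
                ZMQ.proxy(frontend, backend, null);
            }
        }
    }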
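Propagating an order into the graph might look like the following Neo4j Java driver sketch; the node labels and properties are hypothetical stand-ins for the actual Sales and Accounting records.

    import org.neo4j.driver.*;
    import static org.neo4j.driver.Values.parameters;

    public class OrderPropagator {
        public static void main(String[] args) {
            try (Driver driver = GraphDatabase.driver("bolt://graphhost:7687",
                    AuthTokens.basic("neo4j", "pass"));
                 Session session = driver.session()) {
                // MERGE is idempotent, so replayed events cannot create duplicates.
                session.run(
                        "MERGE (c:Customer {id: $cust}) "
                      + "MERGE (o:Order {id: $order}) SET o.total = $total "
                      + "MERGE (c)-[:PLACED]->(o)",
                        parameters("cust", "C-1001", "order", "O-5001", "total", 249.99));
            }
        }
    }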
Data Architect
- Led a company-wide re-engineering effort to increase the efficiency and productivity of the product development workflow by a factor of 100.
- Domain-modeled the interactions and relationships of the upstream value chain of the McGraw Hill Enterprise Solutions Xpressfeed System (ESXF) to establish the necessary baseline for refactoring, integration, optimization, and automation.
- Architected, in coordination with global design and development teams from New York to Mumbai, the End-of-Day (EOD)/Quant House Package automation process from metadata creation to data feed generation, shortening the time to market of Cap IQ information products from weeks to hours.
- Architected a late-stage feed customization capability that enables Cap IQ to tailor-make financial reporting products, among other possibilities allowing market securities to free-float across exchanges while preserving product integrity by maintaining each customer's access to all affected securities through the end of the subscription period.
- Implemented a late-stage customization capability for S&P market-close analytics, which are published daily as multi-billion-row data extracts packaged across several international time zones and financial markets for the institutional investment community and partner companies within a mandatory 3-hour window. Used Talend (tImpalaLoad) to move CDC events into Hive-generated HBase tables sharded across 100 EC2 region servers; the resulting massively parallel processing (MPP) throughput enabled real-time integration of millions of late-stage corrective edits and after-market updates while significantly improving the performance and reliability of the build-to-order workflow (upsert sketch after the technology list below).
- Coordinated between Business and Development in developing new Data Feed products.
- Supported QA processes to help minimize defects and downtime and to improve communication between the Development and QA organizations in Denver and Mumbai.
- Streamlined the Xpressfeed technical design to make it more intuitive and easier to communicate while addressing current and future business needs.
- Technology: ERWIN, StarUML, Informatica, ALM/VersionOne, Oracle, Hadoop/Cloudera/Impala, Cassandra, SQL Server, Toad, HeidiSQL, SSIS, SSRS, Talend, Java, Hibernate, XML, XSLT, Java EE, Weblogic, JDBC, Linux.
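A hedged sketch of the late-stage CDC upsert path using the standard HBase Java client; the production load ran through Talend's tImpalaLoad, and the table, column family, and row-key convention shown are hypothetical.

    import org.apache.hadoop.hbase.HBaseConfiguration;
    import org.apache.hadoop.hbase.TableName;
    import org.apache.hadoop.hbase.client.*;
    import org.apache.hadoop.hbase.util.Bytes;

    public class LateStageUpsert {
        public static void main(String[] args) throws Exception {
            try (Connection conn =
                         ConnectionFactory.createConnection(HBaseConfiguration.create());
                 Table table = conn.getTable(TableName.valueOf("xf_late_updates"))) {
                // Row key leads with the security id so writes spread across
                // the sharded region servers rather than hot-spotting one node.
                Put put = new Put(Bytes.toBytes("IBM#2015-10-30"));
                put.addColumn(Bytes.toBytes("d"), Bytes.toBytes("close"),
                        Bytes.toBytes("140.08"));
                put.addColumn(Bytes.toBytes("d"), Bytes.toBytes("source"),
                        Bytes.toBytes("CDC"));
                table.put(put); // HBase puts are idempotent upserts, ideal for CDC replay
            }
        }
    }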
Applications Architect
- Primary liaison between IT and Quality Control organizations at Boeing/ULA, documenting user stories, leading the design effort to convert stories into development objectives, fleshing out critical system integration requirements, setting coding priorities, providing training, and driving the development effort end-to-end.
- Developed and maintained the Product Integrity Reporting System (PIRS), the quality control system for both the Delta and Atlas product families.
- Built and maintained the Enterprise Data Warehouse and all in-bound/out-bound PL/SQL and ETL data interfaces, including interactive web-based and scheduled reports, using Oracle, Informatica, Pentaho, and Talend.
- Developed and maintained mission-critical Test Requirements Document (TRD) generation engine for the Delta Rocket using Java, JSP, REST, XSL, JDeveloper, Oracle DB and Apache FOP.
- Leveraged MongoDB (NoSQL) to develop next-gen TRD repository services enabling storage and reuse of infrequently revised document artifacts, reducing report-generation overhead by 40% (artifact-cache sketch after the technology list below).
- Developed enterprise listener service to monitor asynchronous processes, converting error notifications to emailed bug reports via JSON web service integration to Bugzilla with dashboard development in AJAX, Struts and Hibernate.
- Developed and customized multiple in-bound and out-bound interfaces to SAP Supply Chain Management and XI interfaces to FICO.
- Technology: ERWIN, StarUML, ALM, Rational Rose, Informatica, Pentaho, Talend, Oracle, Oracle AQ, PL/SQL, SQL Server, Toad, NoSQL, MongoDB, Perl, XML/XSLT, SOAP/REST, JSON, JavaScript, Hibernate, SAP, XI, SAS, Nagios, JDBC, Java, JPA, JUnit, JAXB, JMS, JSP, Groovy, Apache, Tomcat, J2EE, VB, AJAX, jQuery, Struts, Bugzilla, OMNIS, Oracle 10gAS, JDeveloper 10g, Apache FOP, HTML, Linux, UNIX
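The TRD artifact-reuse path can be sketched with the MongoDB Java driver: check for a cached, previously rendered fragment before regenerating it. The collection and field names are hypothetical.

    import com.mongodb.client.*;
    import org.bson.Document;
    import static com.mongodb.client.model.Filters.and;
    import static com.mongodb.client.model.Filters.eq;

    public class TrdArtifactCache {
        public static void main(String[] args) {
            try (MongoClient client = MongoClients.create("mongodb://repohost:27017")) {
                MongoCollection<Document> artifacts =
                        client.getDatabase("trd").getCollection("artifacts");
                // Reuse the rendered fragment if this section/revision exists.
                Document cached = artifacts.find(
                        and(eq("section", "4.2.1"), eq("revision", "C"))).first();
                if (cached != null) {
                    System.out.println(cached.getString("renderedFo"));
                } else {
                    // Regenerate once, then cache for future report runs.
                    artifacts.insertOne(new Document("section", "4.2.1")
                            .append("revision", "C")
                            .append("renderedFo", "<fo:block>...</fo:block>"));
                }
            }
        }
    }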
Principal Database Architect
- Led a team of ten analysts, DBAs, and SQL programmers in the design and development of the Physician’s Credentialing Data Warehouse, integrating more than 150 data sources, ranging from the Drug Enforcement Administration to the American Medical Association, to drive strategic web-based applications such as Practice Match Select, which enables staffing support at HMOs to locate medical specialists by combining any of 73 distinct dimensions, including Degree Type, Board Certification, State Licensing, Gender, and Location.
- The warehouse, a multi-dimensional database, maintains business elements in granular form, representing every possible combination of selection criteria to provide the highest degree of flexibility across applications. To prevent exponential growth in storage requirements, vertically partitioned the cube and distributed data across several machines; to preserve the view of an integrated warehouse, devised a query optimization procedure enabling distributed bitmap-resolution queries, previously impossible in Oracle 8i (fan-out illustrated after the technology list below). This architecture delivers response times within 2 seconds of a conventional multi-dimensional design while requiring only 1/10,000 the space.
- Created the cubes with Informatica’s Warehouse Designer, designing dimensions and their defining hierarchies and levels, and integrated diverse data sources diagrammatically with Informatica PowerCenter to promote uniformity and enable graphical impact and lineage analysis. Developed complex data transformations in Informatica to leverage the array processing of its Scalable Pipeline Processing Architecture, and coded I/O operations as multithreaded queries to leverage Oracle’s Partitioning and Parallel options. Implemented Business Objects for OLAP functionality, enabling downstream viewing of upstream data definitions.
- Technology: ERWIN, Informatica, Oracle, PL/SQL, Business Objects, Java, XML/XSLT, Linux, Windows
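The distributed bitmap-resolution idea can be illustrated in modern Java: each vertical partition answers one dimension predicate with a bitmap over physician row ids, and the coordinator intersects the results. This is a conceptual illustration, not the original Oracle 8i implementation; the dimensions and values shown are hypothetical.

    import java.util.BitSet;
    import java.util.List;
    import java.util.concurrent.*;

    public class BitmapResolver {
        // Each partition holds one dimension and returns matching rows as a bitmap.
        interface Partition {
            BitSet match(String dimension, String value);
        }

        static BitSet resolve(List<Partition> partitions, ExecutorService exec)
                throws Exception {
            // Fan one predicate out to each partition in parallel.
            List<Future<BitSet>> futures = List.of(
                    exec.submit(() -> partitions.get(0).match("boardCertified", "Y")),
                    exec.submit(() -> partitions.get(1).match("state", "CA")),
                    exec.submit(() -> partitions.get(2).match("degreeType", "MD")));
            BitSet result = (BitSet) futures.get(0).get().clone();
            for (int i = 1; i < futures.size(); i++)
                result.and(futures.get(i).get()); // AND = rows matching all criteria
            return result;
        }
    }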