Solution Architect Resume

New York, NY

SUMMARY:

  • Seeking a technology engineering position, bringing 20+ years of professional software experience developing high-performance, service-oriented, and event-driven enterprise distributed computing systems.
  • Experienced in building a broad range of large-scale distributed applications across industries including investment banking, insurance, and telecommunications service and equipment providers;
  • Track record of applying high-quality Agile engineering best practices and demonstrating core competency across a variety of development roles;
  • 20+ years of progressive and diversified experience in all phases of the software development life cycle (SDLC), including requirements discovery, analysis, design, development, testing, and deployment, with a track record of successful, high-quality deliveries;
  • Skilled at quickly framing complex business problems in technical architecture, leading the development of sound and flexible solutions, and communicating effectively;
  • Big Data Engineering and Advanced Analytics
  • Full-stack big data expertise in architecting and building highly scalable distributed systems using state-of-the-art technologies including Hadoop, Spark, Storm, Kafka, Flume, NoSQL databases, Cassandra, HDP, the Typesafe stack, AWS, and SolrCloud;
  • Play a leading role in prototyping, architecture design, development, testing, product deployment, and delivery;
  • Vendor and open source technology evaluation covering real-time streaming, Complex Event Processing (CEP), NoSQL databases, advanced data modeling and management, analytical tools, and high-productivity frameworks;
  • Experience implementing and demonstrating big-data-driven solutions for predictive modeling, data mining, and research on large-scale, complex data sets using machine learning, graph modeling, text mining, and other modern techniques;
  • Skilled in designing and developing big data ingestion frameworks and architecture in the Hadoop ecosystem, including consolidating, validating, and cleansing data from disparate sources and various formats;
  • Distributed Machine Learning and Data Mining Expertise
  • Solid academic background in machine learning, pattern recognition, data mining, predictive modeling and quantitative analysis;
  • Familiar with various machine learning algorithms such as Logistic Regression, Naïve Bayes, Neural Networks, SVM, Collaborative Filtering, LDA, Hidden Markov Models, Conditional Random Fields, Spectral Clustering, Decision Trees, Ensemble Learning, AdaBoost, Bagging, and Deep Learning;
  • Extensive working experience mapping real-life business requirements into machine learning models and building prototype data science solutions as proofs of concept for prospective business customers;
  • Specialized in maintaining and upgrading cutting-edge machine learning analytics tools such as Weka, RapidMiner, Spark MLlib, GraphLab, Mahout, and H2O; a minimal illustrative sketch follows this list.
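
For illustration only, a minimal sketch of the kind of Spark MLlib pipeline this experience covers, assuming a hypothetical labeled CSV of policyholder attributes (the path, column names, and label are invented for the example):

    import org.apache.spark.ml.classification.LogisticRegression
    import org.apache.spark.ml.feature.VectorAssembler
    import org.apache.spark.sql.SparkSession

    object ChurnModelSketch {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder.appName("churn-sketch").getOrCreate()

        // Hypothetical labeled dataset; path and columns are illustrative only.
        val df = spark.read
          .option("header", "true")
          .option("inferSchema", "true")
          .csv("hdfs:///data/policyholders.csv")

        // Assemble raw columns into the single feature vector MLlib expects.
        val features = new VectorAssembler()
          .setInputCols(Array("age", "tenureYears", "claimCount"))
          .setOutputCol("features")
          .transform(df)

        val model = new LogisticRegression()
          .setLabelCol("churned") // hypothetical 0/1 label column
          .setFeaturesCol("features")
          .fit(features)

        println(s"Coefficients: ${model.coefficients}")
        spark.stop()
      }
    }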

TECHNICAL SKILLS:

Big Data Hadoop Ecosystem: Hadoop, YARN, MapReduce, Pig, Hive, Solr, Spark, Storm, HBase, Accumulo, Kafka, Flume, Sqoop, Knox, ZooKeeper, and Oozie;

Programming Languages: Java, Scala, C/C++, Python, R, Unix/Linux Shell, Groovy, Perl, JavaScript, PHP;

Real-Time Streaming and Reactive Programming: Spark, Storm, IBM InfoSphere Streams, Akka, Play;

Platforms and Frameworks: Spring Data, Spring XD, Spring Boot, Typesafe;

Cloud Computing Infrastructure: Docker, AWS, OpenStack, Chef

NoSQL: HBase, Cassandra, MongoDB, Neo4j, Redis, Titan, AllegroGraph;

Graph Analytics: Triangle Count, Shortest Path, Connected Components, PageRank, Community Detection, Social Tag Clustering, Cypher, Gremlin;

Semantic Technology: RDF/RDFS Triple Stores, OWL Ontology, SPARQL, FIBO, Jena, Sesame, D2RQ, Continuous Semantics, RDF Streaming;

BI and ETL Tools: Tableau, Spotfire, Cognos, Pentaho, SpagoBI, DataStage, Informatica, Talend;

RDBMS and MPP: Oracle, Teradata, Netezza, Greenplum, Vertica

Test Driven Development and Continuous Integration: OOA&D, Agile, Scrum, Jenkins, Selenium, Cucumber, TFS, Maven, Gradle;

J2EE Enterprise Technology Stack: Servlet, EJB, JMS, JDBC, JNDI, XML, Web Services, JSP, RESTful API, JSON, jQuery, Spring;

Machine Learning Algorithms and Models: Regression, NB, SVM, Deep Learning, Ensemble Learning, LDA, Collaborative Filtering;

Analytics Tools: Weka, RapidMiner, Mahout, Spark MLlib, GraphX, GraphLab, H2O, DL4J, TensorFlow, SPSS;

PROFESSIONAL EXPERIENCE:

Confidential, New York, NY

Solution Architect

Responsibilities:

  • Designed the system as a scalable, fault-tolerant, low-latency, distributed, parallelized cloud-stack solution using an ensemble of big data technologies including HDFS, Spark, Storm, Kafka, Flume, ZooKeeper, YARN, Mesos, Cassandra, Scala, Akka, Play, and Docker;
  • Designed a unified Lambda architecture providing fast access to both historical batch views and near-real-time streaming data for predictive modeling and analysis;
  • Designed and integrated a data ingestion service on the Confluent platform, supporting Flume, Kafka, and the Camus framework to stream data into Spark, HDFS, and SolrCloud (see the streaming sketch after this list);
  • Designed and led the development of fast data integration on a reactive streaming model using Akka Streams, Slick, Play, Spark, and Kafka;
  • Built a Typesafe Activator reactive development environment and integrated ConductR into Docker VM clusters;
  • Adopted a knowledge-driven approach within the big data strategy, intended to be instrumental to various NYL compliance, surveillance, and audit activities such as OFAC, AML, SAM, and claim investigation;
  • Designed and defined an enterprise OWL ontology model describing class structures and business entities according to the Financial Industry Business Ontology (FIBO) standard;
  • Leveraged transformation rules in the Hadoop pipeline to load data into an RDF triple store; implemented the mapping of Pig UDFs to RDF format to integrate with Intel Graph Builder for data loading and conversion (see the RDF mapping sketch after this list);
  • Built a taxonomy solution for entity resolution, disambiguation, and extraction using ML algorithms and models (LDA, TF-IDF) for topic modeling and document classification (see the topic-modeling sketch after this list);
  • Designed and led the development of semantic and faceted search, query parsing and augmentation, and text tagging and indexing with Apache Lucene/Solr (see the Solr query sketch after this list);
  • Designed the Knowledge Graph Data Science API to navigate domains at different levels for business knowledge discovery and relationship analysis;
  • Evaluated and designed the data storage capacity and application infrastructure, running across 8 application servers in the CNJ/ADC data centers and 12 database servers with a total storage capacity of 20+ TB, representing over 50M JSON documents;
  • Designed and implemented RESTful microservice APIs to integrate with various front-end applications (see the REST endpoint sketch after this list);
  • Led the design and implementation of a smart analytics application with Kiji technology to improve NYL customer retention and new business;
  • Led the team in defining, evaluating, deploying, and experimenting with different machine learning predictive models across business cases to enhance the customer experience, improve operational efficiency, and create cross-selling opportunities;
  • Designed and architected the data ingestion and processing toolkit using Spring XD, HDFS, Kafka, Flume, Spark, and Cassandra to deepen customer relationships and support data discovery;
  • Designed and implemented analytical reports with MapReduce, Pig and HAWQ;
  • Designed and developed visualization tools using Spring Boot, Angular and D3 to show analytics results and reports;
  • Involved in the engineering process to enhance the existing structured NYL data warehouse and data marts across multiple LOBs to support unstructured and semi-structured data;
  • Led the technology and engineering evaluation of Hadoop vendors including HDP, Cloudera, and IBM InfoSphere BigInsights with Big SQL and Big Text Analytics;
  • Hands-on experience with installation, administration, and monitoring of big data Hadoop clusters;
  • Designed and defined the architecture roadmap and technology standards for real-time analytics and NoSQL data stores;
  • Worked collaboratively with business units, data engineering, and application teams on best practices, and advised on vendor and tool selection;
  • Built hands-on proofs of concept with Hadoop technologies including MapReduce, Spark, Storm, IBM InfoSphere Streams, HBase, Hive, Mahout, etc.;
  • Assisted the analytics team in building NYL's next-generation strategic intelligent InfoBase, improving effectiveness with new descriptive, predictive, and prescriptive analytics model systems;
  • Explored and scrubbed a wide range of proprietary unstructured enterprise data stores, such as application logs, electronic forms, agent contract documents, compliance audit reports, and transactions;
  • Applied new models and compared them with existing methods to articulate the pros and cons of various technologies and platforms;
  • Designed and architected an ML framework for data capture, aggregation and enrichment, and algorithm development and implementation to support new and improve existing business cases, such as new business, underwriting, fraud detection, claims management, cross-selling, compliance and auditing, customer retention, etc.;
  • Maintained and supported commercial and open source ML solutions and libraries, such as H2O, Apache Mahout, and Spark MLlib;
  • Evaluated business cases, including experiments with selectively replacing costly and inconvenient medical exams with predictive modeling of risk, improving agent recruiting and retention by adopting more big data factors, enhancing lead generation and qualification through predictive modeling, and underwriting high-risk customers in exchange for insurance that had previously been unaffordable or unavailable;
  • Designed the application architecture and data pipeline, including pre-eligibility data scanning, ETL, eligibility analysis, portfolio data acquisition, statement processing, correspondence generation, history analysis, and reporting.
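
For illustration only, a minimal sketch of the speed-layer pattern from the Lambda architecture and ingestion bullets above: consuming a Kafka topic into Spark Streaming micro-batches. The broker address, topic, and group id are hypothetical placeholders, not the production configuration:

    import org.apache.kafka.common.serialization.StringDeserializer
    import org.apache.spark.SparkConf
    import org.apache.spark.streaming.kafka010.ConsumerStrategies.Subscribe
    import org.apache.spark.streaming.kafka010.KafkaUtils
    import org.apache.spark.streaming.kafka010.LocationStrategies.PreferConsistent
    import org.apache.spark.streaming.{Seconds, StreamingContext}

    object SpeedLayerSketch {
      def main(args: Array[String]): Unit = {
        val ssc = new StreamingContext(new SparkConf().setAppName("speed-layer"), Seconds(5))

        val kafkaParams = Map[String, Object](
          "bootstrap.servers" -> "localhost:9092", // hypothetical broker
          "key.deserializer" -> classOf[StringDeserializer],
          "value.deserializer" -> classOf[StringDeserializer],
          "group.id" -> "speed-layer",
          "auto.offset.reset" -> "latest"
        )

        // "events" is a hypothetical topic; real topics depend on the design.
        val stream = KafkaUtils.createDirectStream[String, String](
          ssc, PreferConsistent, Subscribe[String, String](Seq("events"), kafkaParams))

        // Count records per 5-second micro-batch as a stand-in for real logic.
        stream.map(_.value).count().print()

        ssc.start()
        ssc.awaitTermination()
      }
    }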
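
For the FIBO ontology and triple-store bullets above, a hedged sketch of mapping a record to RDF with Apache Jena (3.x packages assumed). The namespace, URIs, and property names are invented placeholders; a real model would use FIBO URIs:

    import org.apache.jena.rdf.model.ModelFactory
    import org.apache.jena.vocabulary.RDF

    object RdfMappingSketch {
      def main(args: Array[String]): Unit = {
        val model = ModelFactory.createDefaultModel()
        val ns = "http://example.com/ontology#" // hypothetical namespace
        model.setNsPrefix("ex", ns)

        // Map one hypothetical account record to triples.
        val account = model.createResource(ns + "account/12345")
        account.addProperty(RDF.`type`, model.createResource(ns + "Account"))
        account.addProperty(model.createProperty(ns + "heldBy"),
          model.createResource(ns + "party/67890"))

        // Serialize as Turtle, ready for loading into a triple store.
        model.write(System.out, "TURTLE")
      }
    }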
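
For the topic-modeling bullet above, a minimal Spark MLlib LDA sketch; the two-document corpus is a toy stand-in for the cleansed enterprise documents described above:

    import org.apache.spark.ml.clustering.LDA
    import org.apache.spark.ml.feature.{CountVectorizer, Tokenizer}
    import org.apache.spark.sql.SparkSession

    object TopicModelSketch {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder.appName("topic-sketch").getOrCreate()
        import spark.implicits._

        // Toy corpus; real input would be cleansed enterprise documents.
        val docs = Seq(
          "claim filed for water damage",
          "agent contract renewal terms").toDF("text")

        val tokens = new Tokenizer()
          .setInputCol("text").setOutputCol("words").transform(docs)
        val vectorized = new CountVectorizer()
          .setInputCol("words").setOutputCol("features")
          .fit(tokens).transform(tokens)

        // Fit a 2-topic LDA model and print the top terms per topic.
        val model = new LDA().setK(2).setMaxIter(10).fit(vectorized)
        model.describeTopics(3).show(false)
        spark.stop()
      }
    }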
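
For the faceted search bullet above, a small SolrCloud query sketch using SolrJ (6.x builder style assumed); the ZooKeeper ensemble, collection name, and facet fields are hypothetical:

    import org.apache.solr.client.solrj.SolrQuery
    import org.apache.solr.client.solrj.impl.CloudSolrClient

    object FacetSearchSketch {
      def main(args: Array[String]): Unit = {
        // Hypothetical ZooKeeper ensemble and collection name.
        val client = new CloudSolrClient.Builder()
          .withZkHost("zk1:2181,zk2:2181,zk3:2181")
          .build()
        client.setDefaultCollection("documents")

        val query = new SolrQuery("fraud") // free-text query term
        query.setFacet(true)
        query.addFacetField("docType", "lineOfBusiness") // hypothetical fields
        query.setRows(10)

        val response = client.query(query)
        response.getResults.forEach(doc => println(doc.getFieldValue("id")))
        client.close()
      }
    }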
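
For the RESTful microservice bullet above, a minimal Play Framework controller sketch (Play 2.6+ style); the route, controller, and payload are illustrative, and a real endpoint would delegate to the back-end data services:

    import javax.inject.Inject
    import play.api.libs.json.Json
    import play.api.mvc.{AbstractController, Action, AnyContent, ControllerComponents}

    // Hypothetical controller; "GET /customers/:id/summary" would map to it
    // in conf/routes.
    class CustomerController @Inject()(cc: ControllerComponents)
        extends AbstractController(cc) {

      def summary(id: String): Action[AnyContent] = Action {
        // Stubbed payload for illustration.
        Ok(Json.obj("id" -> id, "status" -> "active"))
      }
    }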

Confidential

Senior Architect

Responsibilities:

  • iNautix J2EE Platform Framework: Led the technology and engineering evaluation for the acquisition of a Washington, DC-based online asset management startup. Played a vital role in the architecture, design, and implementation of iNautix enterprise J2EE framework components such as the J2EE container, DB persistence, presentation, service management, the operations and management console, authentication, entitlement, and cryptography. Coordinated with different teams on design principles, technology adoption, and the architecture blueprint; reported progress to the Managing Director of Architecture and Infrastructure.
  • iNautix PCFN/FIX Connectivity Platform: An in-house online trading platform consisting of an IPC messaging system, INET server, message router, DB server, and a proprietary message format (TAP). Designed the unified FIX adaptor connectivity architecture to interface with various middleware vendor technologies such as MQSeries, the TAP library, TIBCO, and FIX Javelin Appia, providing client-side OMS connectivity to online retail and institutional customers and to global trading partners such as CSFB Direct UK and Hutchison Hong Kong.
  • Investor Service Management (ISR) for iNautix UK: A web-based application supporting trading and customer account management for online retail customers. Designed the application and integration architecture using J2EE technologies such as JSP, Servlets, XML, and EJB, deployed on WebLogic Server. Led a team from India and collaborated with the London office to complete business domain requirements design, deployment infrastructure, technology adoption, application development, etc.

Confidential

Technical Lead

Responsibilities:

  • Diagnosis Support System (DSS): AT&T's next-generation intelligent network management system supporting network services such as ticketing, configuration, and fault and alarm monitoring of network components. Led the team building a rule-based intelligent knowledge management system; incorporated knowledge from domain SMEs into the Neuron Data engine. The DSS project critically improved the resources allocated to network diagnosis and won AT&T's annual software excellence award in 1998.
  • Nodal Pre-Service Testing System Platform (NPSTS): NPSTS assisted field technicians in completing testing and configuration before turning on service for customers. Led the team developing a web-based application using Java, C++, OrbixWeb CORBA, DSAP MIB, and WebLogic Server.
