Sr. App Delivery Manager / Big Data Solution Architect Resume
Charlotte, NC
SUMMARY
- Technology and business visionary with 15+ years of experience in planning, developing, and implementing state-of-the-art information solutions in the Banking, Insurance, and Telecom industries using Big Data (Hadoop), Massively Parallel Processing (MPP), and Java/J2EE technologies.
- Managed cross-functional teams with diverse technical backgrounds. Adept at team building, solution architecture, problem-solving, and negotiating. Well-rounded IT background spanning infrastructure architecture (Hadoop cluster builds and MPP database builds) and software development management.
- Has a total of 15+ years of extensive experience in application and infrastructure delivery, team building, analysis, design, and development in Big Data (Hadoop), Netezza, grid computing, and Java/J2EE technologies.
- Worked in various capacities, including Sr. Application Manager, Technical Manager, Big Data (Hadoop) Solution Architect, Project Lead, Integration Specialist, SME, and Developer.
- Has extensive work experience with big data architectures, including Hadoop, HDFS, MapReduce, Pig, Hive, Flume, Sqoop, Oozie, Spark, and Spark Streaming.
- Has extensive work experience with and knowledge of real-time data processing, real-time analytics, big data reporting, data modeling, and big data analytical tools (SAS, Tableau).
- Has a good understanding of machine learning techniques for building models. Worked with an external vendor to build a proactive travel-recognition model for the Customer Fraud LOB.
- Exposure to the Banking and Financial (10 yrs), Telecom (4 yrs), and Insurance (2 yrs) verticals.
- Has 10 years of expertise in leading and managing onsite/offshore delivery models.
- Managed and led teams of up to 25 members.
- Expertise in designing and developing n-tier distributed applications across the software development life cycle.
- Thorough knowledge of complete Software Development Life Cycle.
- Proven ability to deliver innovative solutions that fully support corporate growth objectives.
- Has thorough knowledge of project management principles.
PROFESSIONAL EXPERIENCE
Confidential, Charlotte, NC
Sr. App Delivery Manager / Big Data Solution Architect
Responsibilities:
- Architect, design, manage, and deliver the Hadoop data and analytics platform for Consumer Fraud business strategies.
- Work with various technology teams in the Fraud organization to migrate data and processing to the Hadoop platform.
- Work with Fraud Business Strategy teams and migrate their workloads to the Hadoop platform.
- Build Hadoop clusters for big data analytics, big data ETL, big data reporting, machine learning, and operational use cases.
- Create a technology roadmap and strategy that rationalizes the existing application portfolio and infrastructure.
- Integrate existing business user tools with Hadoop technology.
- Deliver reporting capabilities on the Hadoop platform.
- Deliver machine learning capabilities on the Hadoop platform.
- Architect and deliver big data solutions for real-time use cases.
- Manage Hadoop implementation teams.
Technology: Hadoop, MapReduce, HDFS, Hive, Impala, Pig, Oozie, Sqoop, Hue, Flume, Cassandra, PC/SAS, Cloudera Manager, CDH 4.x/5.x, Spark, Java/J2EE, Oracle, DB2, SQL Server, Mainframe environment, Teradata, Eclipse, AutoSys
Confidential, Charlotte, NC
Hadoop Platform Manager and Solution Architect, Credit Risk
Responsibilities:
- Retire and migrate (including data sourcing and transformations) the legacy Oracle credit risk exposure data mart to the Hadoop platform.
- Build a common data sourcing and transformation platform, eliminating redundant ETL processes and technologies.
- Build a common data quality scorecard/dashboard (Datameer) on common data staged on the Hadoop platform.
- Run analytics on Hadoop to support quant needs (Monte Carlo simulation, optimization, ad hoc analytics, etc.).
- Create a technology roadmap and strategy that rationalizes the existing application portfolio and infrastructure.
- Prepare a chargeback model for the shared Big Data (Hadoop) platform.
- Document the big data reference architecture and framework, and ensure all LOBs follow the reference architecture standards.
- Document standards and best practices for HDFS folder organization.
- Gather Hadoop storage and processing needs from multiple LOBs and determine appropriate cluster sizing for new cluster builds.
- Actively participate in Hadoop-interest CoP meetings at the Bank and provide guidance and support on enterprise issues such as Hadoop/Active Directory integration, Kerberos integration, etc.
- Conduct various POCs with Hadoop ecosystem tools (MicroStrategy/Hive/Impala, Hive vs. Pig, Java vs. Python, Flume vs. NFS, Netezza vs. Hadoop, Sqoop vs. Talend, Oozie vs. Talend, Hadoop/R, Hadoop/MATLAB, Hadoop/Datameer, Hadoop/Drools, Hadoop/Teradata/Netezza, WebHDFS vs. HDFS, Cloudera vs. MapR, etc.).
- Work with TI to build five Hadoop clusters (Prod/50, DR/50, UAT/20, Dev/20, R&D/20 nodes)
- Configure and administer Hadoop clusters.
- On-board new LOB applications.
- Promote the use of Hadoop in other LOBs (Enterprise Credit Risk and CFO Risk).
- Manage Vendor and SCM relationship; and establish Software License and Maintenance Agreement (SLMA) with Cloudera.
- Conduct trainings and KT sessions across multiple LOBs.
- Share the Hadoop platform and help other Hadoop-interested groups in the bank.
Confidential, NC
Application Architect and Infrastructure Manager, Credit Risk
Responsibilities:
- Engage with TI to build five Netezza (P 24) environments, five Confidential Risk Frontier (grid computing) environments, and five Talend environments using 10 G networking and Fusion-io cards for the Enterprise Asset Aggregation (EAA) program.
- Own end-to-end architecture responsibilities and design standards for the EAA program.
- Manage Netezza and Talend vendor relationships.
- Conduct training and KT sessions across multiple teams.
- Conduct Tech SOS meetings, design reviews, and code reviews.
Confidential, NC
Technical Delivery Manager
Responsibilities:
- Was responsible for delivering the Mark-to-Market Assessment of Relationship Credit (MARC) application.
- The Mark-to-Market Assessment of Relationship Credit tool is the source system used to calculate market valuations of corporate credit facilities for deals in the pipeline, as well as ongoing revaluations of existing and closed transactions. The MARC tool calculates a mark by comparing the expected return on the loan to an observed or implied secondary-market return for comparable bank loans, bonds, or credit default swaps.
- Credit risk engine for calculating mark-to-market pricing information.
- Talend for data extraction, transformation, and loading activities.
- Oracle for the data mart and portfolio creation.
- Adobe Flex/WebLogic for portfolio development and job automation.
Technology: Java, Flash, iBATIS, Hibernate, JavaScript, HTML, Servlets, JDBC, JavaBeans, Linux, Oracle, and Eclipse.
Confidential, NC
Tech Lead/ Sr Developer
Responsibilities:
- Involved in building the Portfolio Analysis and Optimization analytical platform for the Bank’s Enterprise Portfolio Confidential. This committee has the responsibility to provide oversight of portfolio optimization, forward-looking analytics to ensure ongoing compliance with Risk Appetite guidelines, and oversight of credit management issues, with an emphasis on generating recommendations that drive portfolio optimization. The Confidential effort leverages Confidential’s Portfolio Manager and home-grown simulation and optimization analytics to identify risk concentrations by industry, geography, and asset type (instrument), determine unexpected loss, and describe changes to portfolio value.
- DataSynapse grid computing environment for running proprietary simulation and optimization models implemented in C++.
- Talend for data extraction, transformation, and loading activities.
- Oracle for the data mart and portfolio creation.
- Adobe Flex/WebLogic for portfolio development and job automation.
Confidential, Charlotte, NC
Consultant
Responsibilities:
- Maintained and enhanced a teacher pension system using Java/J2EE technologies.
- Led the development of a software platform for controlling and provisioning telecom racks, shelves, and switches using Java and EJB technologies.
- Worked on web-based applications using various technologies (Visual C++, Visual Basic, Java/J2EE).