Cloudera/Hadoop Developer Resume
Pleasanton, CA
PROFILE SUMMARY:
- Business-driven, results-oriented technologist with 13+ years of IT experience spanning the architecture, design, and development of enterprise legacy and ETL/DWBI applications, with in-depth business knowledge of Payment Systems, Retail Banking & Securities, Healthcare and Pharmacy systems, and strategic project management.
- Specializes in data warehousing, relational database modelling, ETL methodology, Business Intelligence infrastructure building, data migration, and relational database systems development.
- Passionate about developing a deep understanding of the underlying data in various source systems, and capable of building complex yet optimized data repositories, driven by business needs, that empower BI tools to give customers an edge in making the strategic decisions that grow their business segments.
- Cloudera Certified Hadoop Developer, Cloudera Certified Hadoop Administrator.
- Sound programming capability using core JAVA along with Hadoop framework utilizing Cloudera Hadoop Ecosystem projects (HDFS, Spark, Sqoop, Hive, Impala, etc.).
- A successful and experienced Project Management Professional (PMP) managing multiple portfolios/teams in the Confidential Healthcare system.
- Strong experience/expertise with data warehouse tools, including ETL tools such as Ab Initio and Informatica, BI tools such as Cognos, MicroStrategy, and Tableau, and relational database systems, along with Oracle PL/SQL and Unix shell scripting.
- Strong expertise/experience working on legacy IBM mainframes (z/OS/MVS, COBOL, DB2, CICS, JCL) and enterprise-level job schedulers such as Control-M, AutoSys, and Tivoli.
- Currently leading and managing several onshore development projects across the OPPR program/Pharmacy Data Warehouse for Confidential Healthcare applications.
CORE COMPETENCIES:
Designer/Architect; Solution Consultant - Expert in Data Warehouse Architecture and Design
ETL Architecture - Ab Initio, Informatica 9.x, Unix Shell Scripting
IBM Certified Big Data Consultant
Extensive work in the Cloudera Hadoop framework, utilizing HDFS, Sqoop, Flume, Hive, HBase, Impala, YARN/MR2, ZooKeeper, Oozie, and NoSQL databases.
Project Management Professional (PMP)
Agile Development Methodology (Rally)
Data Modelling
Oracle - PL/SQL
Mainframe: z/OS MVS, DB2, JCL, VSAM, Infoman, BMC Remedy, COBOL, FILEAID, DFSORT, CICS, SDSF/IOF, TSO/ISPF
Java Enterprise Edition (JEE), C, C++ Programming
MicroStrategy, Cognos, Tableau BI Reporting Tools
Control-M, TWS, AutoSys Schedulers
SDLC management/tracking using HP Quality Center (QC)
ENGAGEMENT OVERVIEW:
Confidential, Pleasanton, CA
Cloudera/Hadoop Developer
Responsibilities:
- Actively involved in designing, developing, and implementing a Clinical Data Repository (CDR) to retain the last 10 years of clinical data, comprising pharmacy point-of-sale transactions and covering scenarios such as Sale, Void/Unsell, and Return. The CDR is highly valuable to Confidential for identifying patterns such as fraud (e.g., prescription drugs stolen in a pharmacy), trends in returns, and maximum wait times at a particular pharmacy, all of which help the business improve patient care and service at the pharmacy and proactively prevent fraud.
- Technical platforms used include: Cloudera Hadoop framework, Hive, HDFS, Impala, Sqoop, Spark, Oracle 11g, flat files, UNIX, and IBM AIX.
- Worked with the Cloudera admin team to build an efficient multi-node cluster and configure HDFS for the best query performance.
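As an illustration of the ingestion pattern described above, a typical Sqoop import from Oracle into a Hive table might look like the following sketch. The connection string, credentials, table, and target names are hypothetical placeholders, not the actual project configuration:

```shell
# Hypothetical example: import pharmacy point-of-sale transactions from
# Oracle 11g into a Hive table on HDFS, parallelized across 8 mappers.
sqoop import \
  --connect jdbc:oracle:thin:@//oracle-host:1521/PDWDB \
  --username etl_user -P \
  --table POS_TRANSACTIONS \
  --split-by TRANSACTION_ID \
  --num-mappers 8 \
  --hive-import \
  --hive-table cdr.pos_transactions \
  --target-dir /data/cdr/pos_transactions
```

Once loaded, the table can be queried through Hive for batch workloads or Impala for low-latency analytics.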
Confidential, Pleasanton, CA
Data Warehouse Lead
Responsibilities:
- Designed, developed, and supported the implementation of extraction, transformation, and load (ETL) processes for the Data Warehouse from heterogeneous source systems using Informatica 8.6.1.
- Worked on various PDW enhancement projects, independently designing the ETL solutions and successfully implementing them.
- Actively involved in performance tuning of existing and new source queries and Informatica workflows.
- Worked on EPS General Ledger (GL), POS Transaction Data Store (TDS), Gateway, EED Cutover, and EPS performance tuning. PDW applications generate various downstream feeds that help the regions perform pharmacy/clinical data research, and the General Ledger is extremely important to the Confidential Finance team for auditing prescription sales revenue from pharmacies across all regions.
- Technical platforms used include: Informatica PowerCenter 8.6.1, Oracle 11g/10g, IBM Mainframe, SQL, UNIX Korn shell, HP Quality Center 11.52, Rally (Agile methodology), IBM TWS Scheduler, flat files, and IBM AIX.
- Tuned the performance of several EPS ETL processes, significantly improving job execution times, which helped deliver feeds to downstream subscribers within SLO.
Confidential, Pasadena, CA
Development Lead
Responsibilities:
- Developed and implemented ETL solutions for the LIS Acumen Business Validation Automation process and, later, for LACARE business rules, using the ETL tools Ab Initio and Informatica.
- Maintained an excellent customer and business interface; participated in customer business requirement gathering meetings and automated business validation audits to enhance the customer feed experience.
- Tuned the performance of existing ETL processes.
- Technical platforms used include: Ab Initio, Informatica PowerCenter 8.6.1, Windows 7, Oracle 10g/9i, IBM Mainframe, SQL, IBM DB2, Korn shell, IBM TWS Scheduler, MS Office tools, flat files, and IBM AIX.
- Received accolades from the business for significant contributions to building the ETL process that applies the complex LACMN business automation rules.
Confidential, Pasadena, CA
Development Lead
Responsibilities:
- Developed and implemented ETL solutions for building the Part D Enrollment/Disenrollment business reporting process and, later, the exclusion of LACARE Dual Demo members, using the ETL tool Ab Initio.
- Maintained an excellent customer and business interface; participated in customer business requirement gathering meetings and automated business validation audits to enhance the customer feed experience. Tuned the performance of existing ETL processes.
- Technical platforms used include: Ab Initio GDE, DB2, SQL Server, UNIX AIX, shell scripting, Mainframe, and IBM TWS Scheduler.
- Received customer accolades for significant contributions to building the ETL process with the best possible solution approach.
Confidential, Pasadena, CA
Lead Developer
Responsibilities:
- Developed and implemented ETL solutions for building the Medicare Letter Optimization (MLO) process using the ETL tool Ab Initio.
- Maintained an excellent customer and business interface; participated in customer business requirement gathering meetings and automated business validation audits to enhance the customer feed experience.
- Tuned the performance of existing ETL processes.
- Technical platforms used include: Ab Initio GDE, DB2, SQL Server, UNIX AIX, shell scripting, Mainframe, and IBM TWS Scheduler.
- Played a crucial role in developing and enhancing the Mainframe application interface to unload very large source tables.
Confidential, Jersey City, NJ
ETL Architect/Designer
Responsibilities:
- Led the design and implementation of key business rules for the GIW-GLA (Global Liability Analytics) application using Ab Initio ETL.
- Later worked as a Technical Architect to design and build the SFS data mart from scratch, comprising various securities-lending and other security assets managed by Confidential.
- Involved in requirement gathering, data modelling, creating business rule mappings, and interacting with MicroStrategy reporting teams to build the best possible data mart, consisting of a number of fact, dimension, and summary tables based on customer business needs.
- Technical platforms used include: Ab Initio GDE, Oracle/SQL Server, UNIX AIX, shell scripting, Mainframe, IBM TWS Scheduler, and MicroStrategy.
Confidential, Foster City, CA
Lead ETL & MicroStrategy Reporting Developer
Responsibilities:
- Enhanced key ETL/business modules for the VisaNet Incentive Network (VIN) data mart, and later built key business intelligence reports for VIN loyalty campaigns for merchants and member banks.
- Created and managed the lifecycle of APRIMO Sweepstakes and Campaigns.
- Took the initiative in tuning long-running MicroStrategy reporting queries, achieving outstanding performance gains that helped business users get timely responses to their data requests.