Professional experience (16+ years) in software/application development on both OLTP (On-Line Transaction Processing) and OLAP (On-Line Analytical Processing) systems, specializing in testing, implementation, and maintenance of business application development projects. Skilled at software design using entity-relationship (E/R) diagrams and data warehouse schemas, along with data flow analysis. Proficient in production support and in working with all organizational levels. Experience in the banking/financial (9+ years) and pharmacy/health care (6+ years) industries.
Data Warehouse System: Red Brick, Oracle Express, Informix MetaCube
Operating System: CICS, TSO/ISPF, MVS/ESA, UNIX, VMS, CMS
System Development Utility: FileAid, Endevor, Xpeditor, SPUFI, QMF, Oracle SQL Loader, MKS Source Integrity
Database System: IMS, DB2, Oracle, Ingres, Sybase, Informix, SQL Server
Database Tools: Toad for Oracle, SQL Navigator, ERwin Data Modeler
Programming/Script Language: UNIX Shell script, Oracle PL/SQL, COBOL, JCL, Ingres Report Writer, Informix SPL, C, Focus, Oracle Personal Express
Confidential, Monterey Park, California
- Assisted Informatica users in identifying end-to-end ETL process issues.
- Issues ranged from file receipt and data decryption/encryption, through Informatica PowerCenter 9.5 and 9.6.1 performance and shell scripts, to outbound file transfers and the Autosys job scheduler, including but not limited to extracting/uploading data to various database platforms such as Oracle, Greenplum, and Mainframe datasets accessed through Informatica PowerExchange 9.5.
- Provided best-practice guidelines on data integration methodology to Informatica developers.
- Set up GPG (GNU Privacy Guard) on the UNIX platform for internal/external data transfer.
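A setup like this can be sketched as a minimal shell round-trip. All names below (key identity, e-mail, file names) are illustrative assumptions, not the actual production configuration:

```shell
# Minimal GPG encrypt/decrypt round-trip sketch; all names are hypothetical.
export GNUPGHOME="$(mktemp -d)"          # isolated keyring for the demo
chmod 700 "$GNUPGHOME"

# Generate a passphrase-less batch key (in practice, partners exchange public keys).
gpg --batch --pinentry-mode loopback --passphrase '' \
    --quick-generate-key "ETL Batch <etl-batch@example.com>" default default never

# Encrypt an outbound file for the recipient, then decrypt to verify the round trip.
echo "ACCT|12345|500.00" > outbound.dat
gpg --batch --yes --trust-model always -r etl-batch@example.com \
    -o outbound.dat.gpg --encrypt outbound.dat
gpg --batch --yes --pinentry-mode loopback --passphrase '' \
    -o roundtrip.dat --decrypt outbound.dat.gpg
```

In a real exchange, the partner's public key would be imported with `gpg --import` and inbound files decrypted with the site's own private key.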
- Introduced Informatica versioning and deployment groups.
Confidential, Los Angeles, California
- Applied EPIC data knowledge to verify business rules in the ETL process. Transformed business rules into ETL logic and validated outcome data. The ETL process included data from Clarity and vendors (e.g., Nautilus, Concuity), loaded into both relational and star schemas. Extracted Clarity data to files and sent them to various vendors (e.g., Optum, Crimson).
- Documented test cases and performed peer reviews of teammates’ work.
- Designed, developed, and migrated ETL processes from Clarity to staging using Informatica 9.5. Created a plan to retrofit massive volumes of historical data. Completed subject areas included Pharmacy orders and Imaging orders.
- Provided production support; performed gap analysis on data discrepancies and performance issues.
- Special projects: evaluated an outsourced project; worked on QA and UAT testing and provided feedback on test results; performed Informatica repository impact analysis in relation to EPIC data model changes; participated in Informatica version upgrades; installed other Informatica software such as B2B Data Transformation and B2B Data Exchange; coordinated database server migration.
- Developed data verification reports using Crystal Reports; the reports revealed unmatched data between source and target.
- Developed the PL/SQL ETL process that loads data from staging to the star schema. The system framework was developed by Health Care DataWorks (HCD), combining batch and real-time data sources.
- Attended 8 EPIC classes and earned 8 EPIC 2015 Clarity data model certifications (see page 4 for details).
Consultant, Sr. Programmer
Confidential, Los Angeles, California
- Designed, developed, and migrated ETL processes from Clarity to the staging area and data warehouse using Informatica 8.6.1. Prepared documentation.
- Documented test cases and performed peer reviews of teammates’ work.
Confidential, San Diego, California
Informatica ETL Developer
- Converted the existing enterprise data warehouse data from SQL Server to Oracle using Informatica 8.6; the project primarily involved converting Informatica mappings accordingly.
- Created unit testing plans and verified deliverable quality.
Confidential, Redwood City, California
- Created customized PowerCenter metadata reports with the PowerCenter Data Analyzer interface.
- Deployed Dashboards for easy access to reports with user-based security applied.
- Provided project documentation and a customized user manual, including solutions to simple maintenance issues.
- Developed the ETL process for a new data mart that captured and reported expense, restructuring, and project-spend data for each financial division’s own budget process and tools. Analyzed functional specification documents and flagged discrepancies between required logic and data availability prior to development.
- Performed integration tests.
- Modified ETL processes on a data mart that captured employees’ and contractors’ time tracking, project budgets, and vendor invoice estimates, with data sourced from various databases. Supported the data model architect in amending the defined logic during User Acceptance Testing.
- Participated in the initial phase of the Pershing project, handling subscribed vendor data and data migration. Developed the ETL process that converted the Pershing file format into the company-standard file format. Adapted the company’s record-balancing count functionality into the ETL process.
- Participated in Anti-Piracy data integration testing, which subscribed to vendor data that captured illegal data downloads.
- Redesigned the ETL error capturing and summarizing process. PowerCenter error messages were written into relational tables (one of PowerCenter’s error-handling features). The process linked each error message (if any) to the DATA LOAD FILE ID, a unique tag for each incoming file, then summarized error data by type into a summary error table.
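The summarization step can be sketched in shell. The pipe-delimited layout, file IDs, and file names below are illustrative assumptions; the real process read and wrote relational tables:

```shell
# Hypothetical error log: FILE_ID|ERROR_TYPE, one row per rejected record.
printf 'F001|NULL_KEY\nF001|NULL_KEY\nF001|BAD_DATE\n' > errors.dat

# Count each error type per data-load file ID, producing a summary table.
awk -F'|' '{count[$1 "|" $2]++} END {for (k in count) print k "|" count[k]}' \
    errors.dat | sort > error_summary.dat
cat error_summary.dat
```

For the sample input above, the summary contains `F001|BAD_DATE|1` and `F001|NULL_KEY|2`, one row per file/error-type pair.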
- Documented mapping functionality and system data flow.
- Created and tested UNIX shell scripts that verified and processed Infringement history files, one of the subscribed file types. Created and tested the related Infringement history mappings, sessions, and workflows.
Confidential, Monterey Park, California
- Developed and integrated the World Check ETL process into the existing AML (Anti-Money Laundering) sentinel.
- Created mappings that used different types of transformations and created mapplets to simplify the ETL development process. World Check is a data vendor that collects data on individuals and businesses across different categories. Its unstructured data has complex, varied file layouts that no other subscriber had automated and integrated to feed downstream data systems. This Know Your Customer (KYC) compliance solution is one of the regulatory compliance mandates imposed on financial service providers.
- Tested and deployed SMART application phases 1 and 2 with the business support team and BearingPoint (formerly KPMG). Planned, coded, and delivered the entire SMART data feed, for both the initial launch and ongoing data flow, using Informatica 4.7 and 7.1. Data was sourced from the enterprise data warehouse and Mainframe into the SMART data mart.
- Implemented data validation steps that uncovered missing data during the batch process, along with the associated error handling.
- Managed and monitored the SMART special data process, which handled the entire customer data set sent by the legacy system.
- Coordinated with the DBA to set up the Oracle replication process. Altered the SMART ETL process to accommodate system runs with or without replication in progress. Verified and validated the replication process.
- Documented the SMART system data flow runbook.
- Provided production support and production backup support. The SMART application was used by all 200+ branches.
- Developed and delivered the data feed for the SST star schema data mart initial launch using Informatica. Data was sourced from the EDW, Mainframe, internal data marts, and end-user manual adjustment input from text files.
- Created mappings that implemented slowly changing dimension incremental inserts.
- Assisted the Data Architect in researching information to build the data model. Researched and analyzed user requirements and provided alternatives when needed.
- Analyzed GL requirements and developed the ETL code for the GL data feed from Mainframe to the data warehouse within a very limited timeframe and with minimal information provided.
- Assisted with part of the AFS (commercial loan application) data feed, transforming and converting Mainframe data for the data warehouse.
- Implemented the EDW data loading balance control process, aimed at ensuring data quality.
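The balance-control idea can be sketched as a simple count comparison. File names and counts below are illustrative; in practice the loaded count would come from the loader log or a `SELECT COUNT(*)` against the target:

```shell
# Hypothetical balance check: source record count vs. loaded record count.
printf 'r1\nr2\nr3\n' > inbound.dat
src_count=$(wc -l < inbound.dat)
loaded_count=3   # stand-in for the count reported by the load step

if [ "$src_count" -eq "$loaded_count" ]; then
    echo "BALANCED: $src_count records"
else
    echo "OUT OF BALANCE: source=$src_count loaded=$loaded_count" >&2
    exit 1
fi
```

A check like this runs after each load and fails the batch when counts diverge, so bad loads are caught before downstream consumers read the data.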
- Covered production support as needed.
- Coded complex PL/SQL programs that compared two months of DDA data (typically more than 5 million records of over 3,000 bytes each) to derive delta data, and prepared the proper data set for Business Objects back-end use.
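The month-over-month delta comparison can be illustrated with a small shell sketch (file names and record layouts are hypothetical; the actual work was done in PL/SQL against DDA tables):

```shell
# Hypothetical two monthly extracts: KEY|BALANCE, one record per line.
printf 'A|100\nB|200\nC|300\n' > dda_prev.dat
printf 'A|100\nB|250\nD|400\n' > dda_curr.dat

sort dda_prev.dat > prev.sorted
sort dda_curr.dat > curr.sorted

# comm -13 keeps lines present only in the current month:
# records that are new or whose values changed since last month.
comm -13 prev.sorted curr.sorted > delta.dat
cat delta.dat
```

Here the delta contains `B|250` (changed balance) and `D|400` (new record); unchanged record `A|100` is filtered out.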
Confidential, Irvine, California
- Documented the new systems’ Technical Design Documents. GMP changed its OLTP to a new system with an Oracle back end. The document areas included the system overview and inbound/outbound interface details. Collected system functionality details from the company’s technical and non-technical staff for each system overview. Studied each system’s program code to document the inbound and outbound interface content.
- Participated in the ongoing modification and maintenance of the Shared Data Server (SDS), a publish-and-subscribe data repository. The system was hosted on an HP-9000 server running HP-UX 10.20 and Sybase SQL Server 11.5, and enabled multiple reporting applications to subscribe to the same data elements of the core accounting and portfolio management systems. Analyzed, programmed, tested, and implemented users’ requests.
- Developed the CityData Phase III data warehouse. Responsible for the programming, testing, and implementation of the customer monthly activity summary tables. CityData was hosted in the Red Brick data warehouse environment, using SQR as the extract, transform, and load tool and Brio Query as the front-end access tool.
- Tested data quality in the OLTP system, which contained massive data from various Mainframe legacy applications such as VSAM and IMS. The data sources included various account types: checking, savings, fixed deposits, offshore deposits, installment loans, hire purchases, and trade finance.
- Participated in the full development lifecycle and converted historical data from IMS to DB2.
- Developed and maintained users’ requested reports and files in DB2 CICS COBOL.
- Provided production support; investigated and resolved data discrepancies.
- Documented program specifications and computer operation manual.
- Performed system integration and regression testing before each rollout to avoid data discrepancies.
- Customized RM reports, using the API (Application Programming Interface) to extract the necessary system data and the RPI (Report Programming Interface) to produce reports. Prepared the end-user manual.