Informatica Engineer/Administrator Resume
California
SUMMARY:
- Professional with 16+ years of experience in software/application development on OLTP (On-line Transaction Processing) and OLAP (On-line Analytical Processing) systems, specializing in the testing, implementation, and maintenance of business application development projects. Skilled in software design using entity-relationship (E/R) diagrams and data warehouse schemas, along with data flow analysis. Proficient in production support and in working with all organizational levels. Oracle 10g Certified Professional (OCP). Experience in the banking/financial (9+ years), pharmacy, and health care (6+ years) industries.
TECHNICAL SKILLS:
Data Warehouse System: Red Brick, Oracle Express, Informix MetaCube
Operating System: CICS, TSO/ISPF, MVS/ESA, UNIX, VMS, CMS, QMF
System Development Utility: FileAid, Endevor, Xpeditor, SPUFI, Oracle SQL Loader, MKS Source Integrity
Database System: IMS, DB2, Oracle, Ingres, Sybase, Informix, SQL Server
Database Tools: Toad for Oracle, SQL Navigator, ERwin Data Modeler
Programming/Script Language: UNIX Shell script, Oracle PL/SQL, COBOL, JCL, Ingres Report Writer, Informix SPL, C, Focus, Oracle Personal Express
PROFESSIONAL EXPERIENCE:
Confidential, California
Informatica Engineer/Administrator
Responsibilities:
- Assisting Informatica users in identifying end-to-end ETL process issues in Informatica PowerCenter 9.5 and 9.6.1 — from file receipt and data decryption/encryption through performance, shell scripts, outbound file transfers, and Autosys job scheduling — including extracting/uploading data to various database platforms such as Oracle, Greenplum, and Mainframe datasets through Informatica PowerExchange 9.5.
- Providing best-practice guidance on data integration methodology to Informatica end users.
- Setting up GPG (GNU Privacy Guard) on the UNIX platform for internal/external data transfer.
- Introducing Informatica versioning and deployment groups.
Confidential, Los Angeles, California
Sr. Programmer/Analyst
Responsibilities:
- Applying EPIC data knowledge to the ETL process, which moves data from internal Clarity (Epic) and external files (e.g., Nautilus, Concuity) to the Data Warehouse (CSDW) and Enterprise Data Warehouse (EDW). Processing data into both relational schemas and star schemas. Extracting data from the data warehouse to create outbound files for different vendors such as Optum, Crimson, etc.
- Designing/developing/migrating ETL processes based on requirements from Clarity to the CSDW staging area, CSDW, EDW, various data marts, and various data extracts using Informatica 9.5 ETL. Testing and peer-reviewing ETL code. Retrofitting massive historical data. Completed areas include Pharmacy Order, Imaging Order, etc.
- Providing production support. Researching data discrepancies and other data process issues, such as performance problems, and recommending resolutions where applicable.
- Working on special projects, including analyzing the impact on data fields in the Informatica repository in relation to EPIC upgrade(s), participating in new Informatica version upgrades, installing other Informatica software such as B2B Data Transformation and B2B Data Exchange, and coordinating database server migrations from one server to another.
- Developing data verification reports using Crystal Report.
- Developing the PL/SQL ETL process that loads data from EDW staging into the EDW star schema. The system framework was developed by Health Care Dataworks (HCD), with data sourced from combined batch and real-time feeds.
- Attended training for 8 EPIC Clarity data models: Resolute Hospital Billing, ADT and Prelude, Willow Inpatient, Tapestry with AP Claims, Beaker, Health Information Management, EpicCare Inpatient, and OpTime.
- Certified in 8 EPIC 2015 Clarity data models: Resolute Hospital Billing, Resolute Professional Billing, ADT and Prelude, Willow Inpatient, Tapestry with AP Claims, Beaker, Health Information Management, and EpicCare Inpatient.
Confidential, Los Angeles, California
Consultant, Sr. Programmer (contract position)
Responsibilities:
- Designing/developing/migrating ETL processes based on business requirements into the Data Warehouse staging area and Data Warehouse using Informatica 8.6.1 ETL. Preparing documentation.
- Testing and peer-reviewing ETL code.
Confidential, San Diego, California
Informatica ETL Developer (4 weeks Contract)
Responsibilities:
- Migrating the existing enterprise data warehouse data from SQL Server to an Oracle database using Informatica 8.6.
- Creating unit testing plans.
Confidential
Responsibilities:
- Creating PowerCenter metadata customized reports with the PowerCenter Data Analyzer interface.
- Deploying Dashboards for easy access to reports with user-based security applied.
- Providing project documentation and customized user manuals covering solutions for non-complicated maintenance issues.
- Developing the ETL process for a new data mart that captures and reports all expense, restructuring, and project spend data for each financial division's own budget process and tools. Analyzing functional specification documents and reporting discrepancies between required logic and data availability prior to development.
- Participating in status meetings to provide updates on development status and project progress.
- Performing integration tests.
- Modifying ETL processes on a semi-production data mart that captured employees' and contractors' time tracking, project budgets, and vendor invoice estimate data, sourced from various databases. Supporting the data model architect in amending previously defined logic during User Acceptance Tests.
- Participating in the initial phase of the Pershing subscribed vendor data migration. Developing the ETL process that converts the Pershing file format into the company standard file format. Adapting the company's record balancing count functionality into the ETL process.
- Participating in system integration testing of Anti-Piracy data, a subscribed vendor data feed that captures illegal data downloads.
- Re-designing the ETL error capturing and summarizing process. PowerCenter error messages were written into relational tables, one of PowerCenter's error-handling features. This process linked each error message, if any, to DATA LOAD FILE ID, a unique file tag for each incoming file, then summarized and wrote the error types into a summary error table.
- Creating documentation for developing mappings.
- Creating and testing UNIX shell scripts that verified and processed Infringement History files, one of the subscribed file types. Creating and testing the related Infringement History mappings, sessions, and workflows.
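As an illustrative sketch only — the actual file layout, names, and validation rules of that engagement are not in this resume and are assumptions here — a pre-load verification script of the kind described above might check that an incoming file's trailer record count matches its detail records before the PowerCenter workflow picks it up:

```shell
# Hypothetical sketch; assumed pipe-delimited layout:
#   line 1        header,  e.g. HDR|20240101
#   lines 2..N-1  detail records
#   line N        trailer, e.g. TRL|<detail record count>

verify_file() {
    file=$1
    # Record count claimed by the trailer (second pipe-delimited field)
    expected=$(tail -1 "$file" | cut -d'|' -f2)
    # Actual detail records: total lines minus header and trailer
    actual=$(( $(wc -l < "$file") - 2 ))
    if [ "$expected" -eq "$actual" ]; then
        echo "OK: $actual records"
    else
        echo "MISMATCH: trailer says $expected, found $actual"
        return 1
    fi
}

# Example run against a small sample file
printf 'HDR|20240101\nA|1\nA|2\nTRL|2\n' > /tmp/sample.dat
verify_file /tmp/sample.dat
```

A script like this would typically run as a pre-session command, so a bad file fails fast instead of partially loading.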
Confidential, California
System Specialist
Responsibilities:
- Developing and integrating the World Check ETL process into the existing AML (Anti-Money Laundering) sentinel.
- Managing and monitoring SMART special data processes on occasion, such as providing solutions to process the entire customer dataset sent by the legacy system.
- Coordinating with the DBA to set up the Oracle replication process. Altering the SMART ETL process so that it can run with or without replication in progress.
- Documenting the SMART system data flow and run procedures.
- Providing production support and backup production support; the SMART application is used by all 200+ branches.
- Analyzing GL requirements and developing ETL code for the GL data feed from Mainframe data to the data warehouse within a very limited timeframe and with minimal information provided. Assisting the Data Architect in researching the information needed to build the data model.
- Assisting with the partial AFS (commercial loan application) data feed transformation from Mainframe data to the data warehouse.
- Implementing the EDW data loading balance control process.
- Covering production support as needed.
- Coding complex PL/SQL programs that handled two months of DDA data comparisons (DDA usually contains more than 5 million records of more than 3,000 bytes each) and prepared the proper data set for Business Objects back-end use.
Confidential, Irvine, California
Programmer Analyst
Responsibilities:
- Documenting new systems’ Technical Design Documents. GMP changed its OLTP to a new system with an Oracle back-end. The documented areas included each system overview and the details of its inbound and outbound interface(s). Collecting system functionality from the company’s technical and non-technical staff for each system overview. Studying each system’s program code to document the inbound and outbound interface contents.
- Participating in the ongoing modification and maintenance of the Shared Data Server (SDS), a publish-and-subscribe data repository. The system is hosted on an HP-9000 server running HP-UX 10.20 and Sybase SQL Server 11.5, and it enables multiple reporting applications to subscribe to the same data elements of the core accounting and portfolio management systems. Analyzing, programming, testing, and implementing users' requests.
- Developing the CityData Phase III data warehouse. Responsible for the programming, testing, and implementation of the customer monthly activity summary tables. CityData is hosted in the Red Brick data warehouse environment, using SQR as the extract, transform, and load tool and Brio Query as the front-end access tool.
Confidential
Analyst Programmer
Responsibilities:
- Analyzing users’ requirements.
- Testing the data quality in an OLTP system containing massive data drawn from various Mainframe legacy application sources such as VSAM and IMS. These data sources covered various account types: checking, savings, fixed deposits, offshore deposits, installment loans, hire purchases, and trade finance.
- Participating in the full development cycle and data conversion to the new system, converting data from IMS to DB2.
- Developing and maintaining users’ requested reports and files in DB2 CICS COBOL.
- Investigating data discrepancy.
- Documenting program specifications and the computer operations manual.