
Senior Informatica Developer Resume


Holmdel, New Jersey

SUMMARY:

I am seeking employment as an Informatica developer. I have more than 17 years of experience with Informatica 4.7 through 10.0.1, Oracle, TOAD, SQL, PL/SQL, Unix ksh, Autosys, mainframe MVS COBOL, Informatica IDQ 9.0.1, Informatica PowerExchange for IMS, DB2, and VSAM, AML, JCL, Sybase, SAP, TIDAL, SQL Server, DB2/UDB, UNIX, Micro Focus PC COBOL, and Microsoft Office products.

EXPERIENCE:

Confidential, Holmdel, New Jersey

Senior Informatica Developer

Responsibilities:

  • Responsible for development, testing, and support of numerous workflows loading a Credit Risk Data Warehouse comprising 14 dimension and fact tables.
  • One of the more complex mappings joined 7 input files created by the Santander Loans processing system, SCUSA.
  • To calculate daily risk attributes accurately, the daily, monthly, and previous-month data, bankruptcy and repossession data, borrower information, and static data, including the previous day's Union file created by this same mapping, all had to be joined.
  • The generated Union file contains numerous attributes for each contract and is used as a source in multiple downstream mappings to load the Credit Risk Datamart.
  • In testing, I identified duplicate entries in the borrower data coming from the source system that were causing duplicate records to be generated during dimension and fact loads. I discussed the issue with the business and coded the mappings to route the duplicate entries to an error table so as not to impact subsequent loads.
  • I also provided load validation reports to ensure data quality.
  • I created a shell script to get the previous file names and update the input parameter entries in the Informatica parameter file so the correct files were read and processed (see the sketch after this list).
  • Informatica 9.6.1, Oracle 12c.
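
A minimal ksh sketch of such a parameter-file update; the paths, file-name pattern, and the $InputFile_PrevUnion parameter name are illustrative assumptions, not the actual project names:

    #!/bin/ksh
    # Sketch only: paths, file pattern, and parameter name are assumptions.
    SRC_DIR=/data/scusa/incoming
    PARAM_FILE=/infa/params/wf_credit_risk.parm

    # Pick up the most recent Union file produced by the prior run
    PREV_UNION=$(ls -1t ${SRC_DIR}/union_*.dat 2>/dev/null | head -1)
    [ -z "$PREV_UNION" ] && { echo "No previous Union file found" >&2; exit 1; }

    # Rewrite the $InputFile_PrevUnion entry so the workflow reads the right file
    sed "s|^\\\$InputFile_PrevUnion=.*|\$InputFile_PrevUnion=${PREV_UNION}|" \
        "$PARAM_FILE" > "${PARAM_FILE}.tmp" && mv "${PARAM_FILE}.tmp" "$PARAM_FILE"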

Confidential, Brandon, Florida

Senior Informatica Developer

Responsibilities:

  • Enhanced, more granular report that identifies fluctuations in alert volumes.
  • Enhanced report layout and readability.
  • Short- and long-term trending analysis at the segment and scenario level.
  • Current and historical alert counts.
  • Daily, weekly and bi-weekly alerts.
  • Integrated Mantas Alert Monitoring with the Citi Alert Case Management system.
  • Mitigated shortfalls in the current BDC report, which created too many false-positive alerts requiring the involvement of the case management team.
  • Created a Unix script for automated migration of Informatica object XML between repositories (see the sketch after this list).
  • Informatica 9.5.0, Oracle, Unix.
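
A minimal ksh sketch of such an automated migration using pmrep object export/import; the repository and domain names, credentials, and control-file path are illustrative assumptions:

    #!/bin/ksh
    # Sketch only: repository/domain names, credentials, and paths are assumptions.
    OBJ=$1; FOLDER=$2; XML=/migrate/${OBJ}.xml

    # Export the workflow XML from the development repository
    pmrep connect -r DEV_REP -d Domain_Dev -n "$INFA_USER" -x "$INFA_PASS" || exit 1
    pmrep objectexport -n "$OBJ" -o workflow -f "$FOLDER" -u "$XML" || exit 1

    # Import into the QA repository, driven by a prepared import control file
    pmrep connect -r QA_REP -d Domain_QA -n "$INFA_USER" -x "$INFA_PASS" || exit 1
    pmrep objectimport -i "$XML" -c /migrate/import_control.xml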

Confidential, Jersey City, New Jersey

Senior Informatica Developer

Responsibilities:

  • Took an inventory of all data sources used in this process to verify each had data for all processing days required for the lookback period.
  • This analysis revealed a few missing days for a few sources; based upon my analysis, an executive decision was made on how to handle those days.
  • The warehouse is in DB2, which presented a few challenges in designing reports to show that all source records were transformed and loaded into Actimize for each day of processing.
  • To prove all historical data was loaded into the target database, I created a series of workflows to reverse-replicate the SQL Server target and reference data (lookups) back to DB2. Queries were then created to join source, lookup, and target records together in a report proving all source data was transformed and loaded into the target database as designed (see the sketch after this list). The outcome of these reports was inserted into validation history tables so each day's validation processing was retained.
  • Another Informatica workflow was created to read these validation reports and automatically trigger a failure routine, passing an error message with the reason for the validation error.
  • The design of this validation process was so successful that it was also implemented in the daily production load process going forward.
  • Informatica 9.1.0, DB2, SQL Server.
  • Development lead on the project.
  • Architected the entire process.
  • Team of 4 developers.
  • Created mappings and workflows, database objects, process-validation SQL, and data-validation SQL.
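
A minimal ksh/DB2 sketch of one such daily validation query; the database, table, and column names are illustrative assumptions, not the actual schema:

    #!/bin/ksh
    # Sketch only: database, table, and column names are assumptions.
    RUN_DATE=${1:-$(date +%Y-%m-%d)}

    db2 connect to DWHDB >/dev/null || exit 1

    # Join each day's source rows to the SQL Server target rows that were
    # reverse-replicated back to DB2, and retain the result for that day
    db2 -v "INSERT INTO VALIDATION_HISTORY (RUN_DATE, SRC_CNT, TGT_CNT, MISSING_CNT)
            SELECT '${RUN_DATE}', COUNT(s.TXN_ID), COUNT(t.TXN_ID),
                   COUNT(s.TXN_ID) - COUNT(t.TXN_ID)
            FROM SRC_TRANSACTIONS s
            LEFT JOIN ACTIMIZE_TARGET_REPL t ON t.TXN_ID = s.TXN_ID
            WHERE s.PROCESS_DATE = '${RUN_DATE}'"

    db2 terminate >/dev/null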

Confidential

Informatica Lead Developer

Responsibilities:

  • I was responsible for conversion of the positions and balances application, Confidential, as well as Confidential.
  • I converted the ETI code (mainframe code generator) to run in the distributed environment in Informatica.
  • Learning to read ETI and the legacy COBOL it generated, and converting that logic into Informatica mappings, was a very tedious, challenging task.
  • Programs had to be written to unpack mainframe data so that output files from the old mainframe process could be compared to the output of the new Informatica workflow using a file-compare utility (see the sketch after this list).
  • Converted ETI COBOL Generator code to Informatica mappings.
  • Created COBOL programs to unpack datasets on MVS so mainframe output data could be compared to the Informatica output.
  • Informatica 9.1.0, DB2, SQL Server.
  • Designed, developed, tested and implemented dynamic audit log parsing routine using Informatica Powercenter.
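
A minimal ksh sketch of the compare step, assuming the unpacked mainframe file has already been transferred to Unix; all file locations are illustrative:

    #!/bin/ksh
    # Sketch only: file locations and names are assumptions.
    MF_FILE=/compare/mainframe/positions_unpacked.dat    # unpacked by the COBOL program
    INFA_FILE=/compare/informatica/positions_out.dat     # produced by the new workflow

    # Sort both sides so row-order differences are not flagged as mismatches
    sort -o /tmp/mf.sorted   "$MF_FILE"
    sort -o /tmp/infa.sorted "$INFA_FILE"

    if diff /tmp/mf.sorted /tmp/infa.sorted > /compare/diff_report.txt; then
        echo "Outputs match"
    else
        echo "Differences found; see /compare/diff_report.txt" >&2
        exit 1
    fi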

Confidential, New York, New York

Informatica Lead Developer

Responsibilities:

  • Worked directly with business analysts and data modelers to understand how the data needed to be transformed and validated. I also assisted the business analysts in refining their rules where issues were identified.
  • Created reusable mapplets and worklets where common processes can be reused for the most efficient management of Informatica code.
  • Assisted team as needed with more complicated Informatica mapping logic.
  • Created common Informatica worklets to validate incoming vendor zip files: unzip them, verify all files within the zip, and compute hash totals to confirm the data is complete as defined by the business (see the sketch after this list). This process was built with the flexibility to handle all 60+ vendor feeds Confidential receives. If a given zip file does not contain all files required for processing or the hash totals do not match, the batch fails and error emails are sent.
  • Created reusable mapplets for validation of common data elements for premium and claims data loads.
  • Worked with Confidential to set up their Informatica repository folder structure to take full advantage of Informatica reusability and processing efficiency.
  • Informatica 9.1.0, Windows, SQL Server.
  • Led a team of three developers.
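
A minimal ksh sketch of the zip validation, assuming a per-vendor manifest listing the expected files and hash totals; the manifest layout and paths are illustrative:

    #!/bin/ksh
    # Sketch only: manifest layout (file|hash per line) and paths are assumptions.
    ZIP=$1
    WORK=/stage/$(basename "$ZIP" .zip)
    MANIFEST=/config/$(basename "$ZIP" .zip).manifest

    mkdir -p "$WORK"
    unzip -o -q "$ZIP" -d "$WORK" || { echo "unzip failed: $ZIP" >&2; exit 1; }

    # Verify every expected file is present and its hash total matches
    while IFS='|' read -r FILE EXPECTED; do
        [ -f "$WORK/$FILE" ] || { echo "Missing file: $FILE" >&2; exit 1; }
        ACTUAL=$(cksum "$WORK/$FILE" | awk '{print $1}')
        [ "$ACTUAL" = "$EXPECTED" ] || { echo "Hash mismatch: $FILE" >&2; exit 1; }
    done < "$MANIFEST"
    echo "Validated $ZIP"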

Confidential, Jersey City, New Jersey

Lead Informatica Developer

Responsibilities:

  • Prior to this project there was no consolidated way to view what unencumbered assets were available, so there was no alternative to paying the brokerage fees associated with executing trades.
  • Worked directly with source-system SMEs, business analysts, and the data modeler to understand how the data needed to be transformed.
  • Created logical design documents for all mappings to describe sources, targets and the transformations applied to convert global trading and settlement system data loaded to the warehouse.
  • Provided validation reports generated by SQL scripts to prove the ETL processes were functioning as designed.
  • The reports showed all appropriate filtering and conversions were applied properly.
  • Based upon these reports, sign-off was received from the business.
  • Informatica 9.0.1, Unix, Sybase, Oracle, Autosys.
  • Led a team of three developers.

Confidential, New York, New York

Lead Informatica Developer

Responsibilities:

  • Address standardization with the Informatica Address Validator is not an exact science.
  • In my testing I saw the need to enhance parsing in every market we standardized.
  • If the address validator did not return a “perfect” matchcode, there were cases where address parsing could be improved.
  • Even when the returned matchcode was “good,” the “out of the box” standardization at times had trouble parsing some elements in almost every market.
  • To ensure greater than 98% accuracy in the address parsing, a high degree of data analysis was performed to identify data patterns and address element key words per market to parse an address into its individual elements. Testing of the process involved hundreds of unit tests using hundreds of thousands of source address lines.
  • Custom parsing for each market included reference tables I created per market to identify the key words indicating where to extract address elements such as apartment numbers (‘apt’, ‘suite’, ‘unit’, etc.), PO boxes, street suffixes, city, state, and zip code (see the sketch after this list). Data-pattern identification was also used to more accurately determine the contents of an incoming address line. For instance, if an address line did not begin with a numeric and did not contain a known street suffix, the next address line was interrogated to determine whether it contained address elements, for the most accurate parsing possible.
  • Involved in server configuration for IDQ components and debugging performance-related issues.
  • Other data elements standardized included dates of birth, names, and company names, to name a few.
  • Created logical design documentation to describe the address standardization and matching process. From this document the UAT testing team was able to create their test scenarios.
  • Intimately involved in unit and SIT testing. UAT testing was performed by an independent testing team within Confidential.
  • The name and address standardization was created in Informatica Developer, imported into PowerCenter as a mapplet, and run in real time. The application engine runs on MVS. Requests were submitted through MQ Series and triggered the PowerCenter workflow to run; results were then sent back to the mainframe via MQ Series.
  • Worked directly with the Confidential business team to build the fraud matching plan for 12 markets.
  • Informatica 9.0.1, SQL, SAS, Unix, Oracle.
  • Led a team of 3 developers.
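
A minimal ksh/awk sketch of one keyword-driven parsing rule, assuming a per-market keyword file and a simple trailing unit designator; the file layout and address format are illustrative:

    #!/bin/ksh
    # Sketch only: keyword file (one unit keyword per line: APT, SUITE, UNIT, ...)
    # and the address-line format are assumptions.
    KEYWORDS=/ref/us_unit_keywords.txt
    KW_RE=$(paste -sd'|' "$KEYWORDS")

    # Split a trailing unit designator (e.g. "APT 4B") off each address line
    while read -r LINE; do
        UNIT=$(echo "$LINE" | awk -v re="(^| )(${KW_RE}) +[0-9A-Z]+$" '
            toupper($0) ~ re { n = split($0, w, " "); print w[n-1], w[n] }')
        STREET=${LINE%"$UNIT"}
        printf "street=[%s] unit=[%s]\n" "${STREET% }" "$UNIT"
    done < addresses.txt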

Confidential, New York, New York

Lead Informatica Developer

Responsibilities:

  • Informatica 8.6, DB2 and UDB.
  • Led a team of 3 developers. I also helped other Informatica teams with complicated Informatica processes as needed.
  • Converted several key processes from mainframe COBOL to Informatica.
  • Converted Tablebase programs to Informatica, as Tablebase was not supported in the distributed environment.
  • I created a parameterized, dynamic DB2/UDB Informatica data-profiling workflow to verify that data from the mainframe was migrated to the distributed environment accurately. Once Informatica profiled the data, a SQL report interpreted the profiling results and wrote them to a delimited file. A shell script then read the report; based upon its contents, the process either succeeded or a failure routine was triggered that sent emails and failed the Informatica workflow executed in Autosys (see the sketch below).
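
A minimal ksh sketch of that report-driven pass/fail step, assuming a pipe-delimited report whose first column carries a PASS/FAIL status; the layout, path, and address are illustrative:

    #!/bin/ksh
    # Sketch only: report layout (STATUS|TABLE|DETAIL) and address are assumptions.
    REPORT=/profiles/profile_report.dat

    FAILURES=$(awk -F'|' '$1 == "FAIL"' "$REPORT")
    if [ -n "$FAILURES" ]; then
        # Failure routine: notify support, then exit non-zero so the
        # Autosys-executed workflow is marked as failed
        echo "$FAILURES" | mailx -s "Data profiling validation failed" etl-support@example.com
        exit 1
    fi
    echo "Profiling validation passed"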

Confidential, New York, New York

Lead Informatica Developer

Responsibilities:

  • Led a team of 5 ETL developers.
  • I provided them with logical and physical design specs and worked closely with them through the more difficult development processes.
  • I also developed the most complicated ETL processes myself.
  • Architected the overall process from extracting the legacy data from various systems ranging from MVS mainframe, Oracle RDB and flat file extracts.
  • The architecture included batch extract data management.
  • The legacy-data ETL was designed similarly to a standard Type 2 dimension, enabling change capture so that only new and changed rows were processed, for efficiency (see the sketch after this list).
  • Worked with Legacy systems personnel to establish connectivity on many platforms including IBM MVS COBOL/CICS, Oracle RDB, AS400, Oracle and flat file extracts from systems that had unique proprietary software.
  • Reviewed Informatica mappings to ensure proper performance tuning.
  • I eliminated unnecessary ports in lookups and verified source data was located on local disk for faster data retrieval.
  • I also made sure there were no transformation errors, which can drastically decrease performance.
  • Validation of the Informatica conversion process was designed in three stages.
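
A minimal ksh/DB2 sketch of the checksum-based, Type 2 style change capture described above; the database, table, and column names are illustrative assumptions:

    #!/bin/ksh
    # Sketch only: database, table, and column names are assumptions.
    db2 connect to CONVDB >/dev/null || exit 1

    # Expire current rows whose source attributes have changed (checksum differs)
    db2 -v "UPDATE LEGACY_EXTRACT t
            SET CURR_FLAG = 'N', END_DT = CURRENT DATE
            WHERE t.CURR_FLAG = 'Y'
              AND EXISTS (SELECT 1 FROM STAGE_EXTRACT s
                          WHERE s.ACCT_KEY = t.ACCT_KEY
                            AND s.ROW_CHECKSUM <> t.ROW_CHECKSUM)"

    # Insert new and changed rows as the current version; unchanged rows skip
    db2 -v "INSERT INTO LEGACY_EXTRACT (ACCT_KEY, ROW_CHECKSUM, CURR_FLAG, EFF_DT)
            SELECT s.ACCT_KEY, s.ROW_CHECKSUM, 'Y', CURRENT DATE
            FROM STAGE_EXTRACT s
            LEFT JOIN LEGACY_EXTRACT t
              ON t.ACCT_KEY = s.ACCT_KEY AND t.CURR_FLAG = 'Y'
            WHERE t.ACCT_KEY IS NULL"

    db2 terminate >/dev/null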

Confidential, Jacksonville, Florida

Lead Informatica Developer

Responsibilities:

  • Designed and developed common data extraction routine for over 200 home and auto endorsements.
  • The most complex ETL development was in the endorsement extraction.
  • The endorsements were stored in a common 500-byte field whose layout varied by endorsement type.
  • There were about 20 common COBOL copybooks for these endorsements.
  • Data in OCCURS clauses further complicated the data transformation.
  • I created a lookup table to store the displacements of these endorsements in this field.
  • These displacements were fed into multiple dynamically created substring ports to extract the data accurately (see the sketch after this list).
  • The data type also had to be in the lookup table to properly load the target.
  • Other sources included pipe delimited flat files from credit reporting agencies and bureau of motor vehicles.
  • I assigned more straight-forward mappings to junior developers and monitored their progress.
  • I answered questions as they came across issues they were not able to resolve.
  • Reviewed Informatica mappings to ensure proper performance.
  • I eliminated unnecessary ports in lookups and verified source data was located on local disk for the fastest data retrieval.
  • I also made sure there were no transformation errors, which can negatively impact performance.
  • The conversion database environment was located on the same server as the Informatica server, so there were no network-related performance issues.
  • Loaded multiple slowly changing dimensions including vehicle, underwriting, policy, address, endorsement detail, home and fact table with written and earned premium measures.
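
A minimal ksh/awk sketch of the displacement-driven extraction, assuming a layout file of endorsement type, field name, offset, and length; all names and positions are illustrative:

    #!/bin/ksh
    # Sketch only: layout-file format (type|field|offset|length) and record
    # positions are assumptions; offsets are zero-based within the 500-byte field.
    LAYOUTS=/ref/endorsement_layouts.dat

    awk -F'|' '
        # First pass: load displacement and length per endorsement type and field
        NR == FNR { off[$1 "|" $2] = $3; len[$1 "|" $2] = $4; next }
        # Second pass: substring the premium amount out of the common field,
        # which is assumed to start in column 5 after a 4-byte type code
        {
            type = substr($0, 1, 4)
            k = type "|PREMIUM_AMT"
            print type "|" substr($0, 5 + off[k], len[k])
        }
    ' "$LAYOUTS" endorsements.dat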

Confidential, New York, New York

Lead Informatica Developer

Responsibilities:

  • Worked directly with Informatica developers to resolve complex mapping logic. Created reusable mapplets and expressions for common processes in all mappings in the repositories. Identified mapping logic that could be tuned for performance gains.
  • Developed and implemented logical to physical documentation standards for all Informatica processes to their lowest levels of granularity. This should serve to eliminate discrepancies between business and technology.
  • Put in place migration procedures to ensure constant synchronization between development, user acceptance and production environments.
  • Controlled access and security to all Informatica Repositories and responsible for all migrations to multiple development, user acceptance, production repositories with the proper business approvals.
  • Trained and provided support to the PeopleSoft user community, showing how to set up sessions, upload list files, and execute from the PeopleSoft front end.
  • Key participant in identification of required sources for meeting the goal of compensation based on participant portfolios. Solely responsible for translation of business requirements into technical specifications and writing the corresponding documentation.
  • Worked directly with Confidential personnel to create efficient and accurate data mapping sessions to move data from JPMorgan sources to the appropriate repository tables within the Incentive Compensation Management Repository.
  • Created highly detailed documentation for moving data to the JPMorgan Compensation Repository and Data Warehouse. This documentation encompassed the original source system feeds to loading the Compensation Repository and then to the JPMorgan Data Warehouse. All business rules were included and the physical data transformation logic that was driven by the capture of these rules.
  • Designed and developed a series of data mappings to incrementally load the Incentive Compensation Repository and Data Warehouse, including slowly changing dimensions.
  • Provided examples of creating reusable transformations and unconnected lookups, and of altering tool-generated SQL within objects for increased performance efficiency.
  • Held walkthroughs of mapping designs with JPMorgan/Chase developers to provide knowledge transfer and best practices in terms of performance using Informatica PowerCenter 6.2.
