Lead Informatica Developer Resume

Experience Summary:

10+ years of experience in the IT industry, focusing on business analysis, requirements, development, testing, deployment, and administration of data warehousing using industry-accepted methodologies and procedures. Expertise in business intelligence tools including Informatica PowerCenter, Business Objects, and QlikView. Highlights:

  • Designed and developed data warehouses and data marts based on Bill Inmon and Ralph Kimball methodologies
  • Constantly interacted with business analysts to understand the reporting requirements
  • Analyzed source and target systems and prepared technical specification for the development of Informatica ETL mappings to load data into various tables.
  • Ensured Service Level Agreements (SLA) with business are met or exceeded and escalated internally or externally when necessary for resolution of day-to-day operational production support activities.
  • Designed and Developed Business Objects universes/reports to address business needs.
  • Administered the Informatica repository by creating and managing user profiles and metadata
  • Administered the Business Objects by creating and managing user profiles and metadata
  • Designed dimensional models/star schemas to populate the Datamarts across the Enterprise.
  • Established working relationship with various vendors that helped in successful delivery of the projects.
  • Led a team of junior developers/consultants to deliver a critical financial project for the company.
  • Established coding standards/naming standards in various technologies like Informatica, Business Objects, MS SQL Server.
  • Designed and Developed routines using Data Quality project architect for MDM.
  • Designed and developed solutions to protect private information by encrypting and decrypting the data moving between the company and the vendor.
  • Designed and developed a strategy to populate the subject area hierarchies into data warehouse and datamarts according to the reporting needs.
  • Designed and developed the data transformations for source system data extraction, data staging, data movement and data aggregation.
  • Hands-on experience with data profiling tools such as DataFlux and Informatica
  • Implemented error routines to handle incorrect data.
  • Extracted data from ERP sources using Informatica PowerConnect
  • Expertise in performance tuning at Target, source, mapping, session, system levels.
  • Developed Informatica mapping to capture data changes from the operational source systems into data warehouse and datamarts (slowly changing dimensions).
  • Developed Mapplets/transformations embedding business logic
  • Created source and target partitions to concurrently load the data into data warehouse and datamarts.
  • Created shared folders, local and global shortcuts to propagate changes across the repository
  • Used SQL tools like TOAD, Query Analyzer to run SQL queries and validate the data in warehouse and mart
  • Developed shell /python scripts to handle incremental loads
  • Experience in scheduling Informatica sessions, job dependencies using AUTOSYS.
  • Experience in designing and developing the Universe (business view of the database), developing canned/ad-hoc reports, scheduling processes using Broadcast Agent, and administering BO activities
  • Created reports using Business Objects functionality like Combined Queries, Slice and Dice, Drill Down, Functions, Cross Tab, Master Detail and Formulae etc
  • Experience in developing packages, procedures, triggers, and functions using PL/SQL and T-SQL.

Technical Environment:

ETL Tools: Informatica PowerCenter 8.1/8.0/7.1/6.2/5.1, DTS, SSIS, PowerConnect SAP R/3
Databases: Oracle 9i/8i/8.0/7.x, MS SQL Server 2008/2005/2000/7.0/6.5, DB2, MS Access 97/2000
Reporting Tools: Business Objects XI/5.1/5.0, Data Reports, QlikView
Query Tools: TOAD, SQL*LOADER, Query Analyzer
Languages: Python, SQL, PL/SQL, Transact-SQL, SQL*Plus 8.0, Visual Basic .NET/6.0/5.0, Java 2, HTML, DHTML, ASP, C
Web servers: Apache, IIS, and Tomcat
Data Quality: Business Objects XI Data Quality 11.7
Data profiling: Informatica
Version Control: PVCS, TFS
Methodologies: Waterfall, Agile
OS: MS Windows 2000/NT/98/XP, Unix
Text Editors: Notepad++, Ultra Edit, XML Spy
Other software: MS Word, Excel, PowerPoint, MS Visio, RedGate
Industries: Financial, Insurance, Manufacturing

Project Experience:

Confidential, Beverly, MA Aug '05 – Present
Confidential, established in 1966, is a direct writer of private passenger automobile and homeowner's insurance and has been a GE benefit provider for over 35 years. Confidential provides insurance coverage on Personal Lines (automobile, homeowner, renter, excess liability, and watercraft insurance) and Commercial Lines. EIC provides insurance coverage to General Electric and its employees. EIC also acts as an insurer, reinsurer, and third-party administrator for all of GE's commercial liability claims.

Roles Held: Business Intelligence and Reporting Developer, Lead Informatica Developer
Sr Solution Development Engineer, Data Warehouse Expert, ETL Architect

Workers Compensation Implementation in Juris:
Move WC claim handling to the claim system hosted by Sedgwick. The Juris claim system is supported by SIR electronic document management, making this a 360-degree solution to resolve WC pain points. Managing claims using the TPA's own system creates multiple opportunities not available to Electric via Pinnacle, such as: web access for GE users, volume discounts, staffing flexibility, elimination of paper files, and additional business development opportunities. Through the Juris platform, Electric gains immediate access to Sedgwick's performance-based medical networks, offering opportunities to decrease medical and indemnity loss costs for GE. This project significantly reduces enterprise risk as EDI and other manual compliance processes become fully automated through Juris. Sedgwick's compliance department provides improved access to all current state regulations, shifting compliance ownership to the vendor. Cost certainty is gained, as new regulatory requirements will not have to be built or maintained by Electric. As a result, it was the data warehouse team's responsibility to get the data back from Juris on a nightly basis and fit it to the existing datamart structures for reporting needs.

Responsibilities:

  • Design and Develop Informatica processes to extract the data from Juris Feed on a daily basis
  • Design and physicalize the data model for the enterprise data warehouse
  • Estimate the database space requirements for the DBAs based on vendor input and existing workers' comp data
  • Work closely with the business analyst to come up with the source-to-target document mapping the Juris data to the existing data structures.
  • Develop Python scripts to separate the various record types provided by the vendor (Sedgwick)
  • Work closely with project management to come up with a detailed task list needed to support the development work
  • Design and develop control mechanisms to ensure data accuracy between Juris and Electric Insurance data marts
  • Develop ETL standards for efficient code and re-usability
  • Developed SSIS packages to baseline production data into Dev/QA for unit and regression testing purposes
  • Create Documentation as needed for production support teams.
Environment: Informatica PowerCenter 8.1.1, SQL Server 2008/2005, Business Objects XI, TFS, Quality Center, Windows XP, Microsoft Visio, Python2.5, Autosys
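As an illustration of the record-type separation described above, a small Python routine can bucket vendor feed lines by a leading type code. This is a sketch only; the two-character type field and the sample layout are assumptions, not the actual Sedgwick file specification.

```python
from collections import defaultdict

def split_by_rec_type(lines, type_len=2):
    """Group vendor feed lines by their leading record-type code.

    Assumes each line begins with a fixed-width record-type field
    (a hypothetical layout; the real vendor spec defines the format).
    """
    buckets = defaultdict(list)
    for line in lines:
        rec_type = line[:type_len]
        buckets[rec_type].append(line)
    return dict(buckets)

# Example: three hypothetical record types in one nightly feed
feed = ["01HEADER...", "02CLAIM-A", "02CLAIM-B", "09TRAILER"]
split = split_by_rec_type(feed)
```

Each bucket can then be written to its own staging file for the downstream Informatica mappings.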

Auto Datamart:
The project is a result of a new front-end implementation for the auto line of business. The business wanted to capture all the meaningful business process metrics to get better insight into the quote/policy life cycle. Some of the key metrics include quotes completed, open quotes, declines, bind starts, binds, quote errors, etc. The project also extends to capturing response time, time spent on each page, etc.

Responsibilities:

  • Developed a Python script to initiate a web service call that extracts the operational data in XML form and loads it into the SQL tables
  • Used Informatica to parse the XML data into the datamart structures that are further utilized for the reporting needs.
  • Designed and developed Business objects universe to cater to the reporting needs.
  • Developed hierarchy model using the agency structures provided from the operational world
  • Designed the Datamodel for EDW and Datamart
  • Estimated the space requirements for DB
  • Physicalized the EDW and Datamart and added indexes as needed for performance.
  • Implemented slowly changing dimensions to capture history as needed
  • Helped QA and Business Users regression test the Datamart and validate test results.
Environment: Informatica PowerCenter 8.1.1, SQL Server 2008/2005, Business Objects XI, TFS, Quality Center, Windows XP, Microsoft Visio, Python2.5, Autosys
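The XML-to-staging step described above can be sketched with the standard library's ElementTree. The payload shape, element names, and field list here are hypothetical; the real web-service response schema would differ.

```python
import xml.etree.ElementTree as ET

# Hypothetical payload shape, not the actual service response schema.
xml_payload = """
<quotes>
  <quote id="Q1"><status>bound</status><premium>812.50</premium></quote>
  <quote id="Q2"><status>open</status><premium>640.00</premium></quote>
</quotes>
"""

def quotes_to_rows(xml_text):
    """Flatten quote elements into (id, status, premium) tuples
    ready for a bulk insert into SQL staging tables."""
    root = ET.fromstring(xml_text)
    return [
        (q.get("id"), q.findtext("status"), float(q.findtext("premium")))
        for q in root.findall("quote")
    ]

rows = quotes_to_rows(xml_payload)
```

The flattened rows map naturally onto the relational staging tables that the Informatica mappings read.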

Medicare Project:
Achieve compliance with regulatory requirements and specifications provided in Section 111 Mandatory Medicare Secondary Payer Reporting; avoid risk of fines for non-compliance ($1,000/day/claim).
Collect and report Medicare data on all injury claims (BI Liability, WC, No-Fault, UIM, MedPay) for claimants who are "Medicare Beneficiaries" to the Centers for Medicare and Medicaid Services (CMS) coordination of benefits contractor (COBC); handle errors and rejections received from COBC; provide functionality to query CMS for "Medicare Beneficiary" status and process responses. Support business processes needed to ensure collection of required data via claims applications (Pinnacle, CMS+, TOPS, MATS) and control reports (Business Objects).

Responsibilities:

  • Developed control mechanisms to ensure the data transmission to the vendor, Crowe Parades
  • Constantly interacted with claims systems application users to understand and transform the data for Medicare reporting needs.
  • Worked closely with the vendor to keep up with the constantly changing requirements
  • Developed Informatica processes to gather data from various claims systems and send it over to vendor
  • Built Autosys schedule to execute the nightly feed over to the vendor
  • Developed Error routines to constantly push the bad data in front of the Business users to correct the data in source systems
Environment: Informatica PowerCenter 8.1.1, SQL Server 2008/2005, Business Objects XI, TFS, Quality Center, Windows XP, Microsoft Visio, Python2.5, Autosys
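A sketch of the kind of error routine described above: rows failing basic field checks are split out so business users can correct the data at the source. The field names (`claim_id`, `hicn`) and the validation rule are illustrative assumptions, not the actual CMS Section 111 layout.

```python
def route_errors(rows, required):
    """Split claim rows into clean and error sets; rows missing a
    required field are routed back to source-system users.
    Field names are hypothetical, for illustration only."""
    clean, errors = [], []
    for row in rows:
        missing = [f for f in required if not row.get(f)]
        if missing:
            errors.append({**row, "error": "missing " + ",".join(missing)})
        else:
            clean.append(row)
    return clean, errors

clean, errors = route_errors(
    [{"claim_id": "C1", "hicn": "123"}, {"claim_id": "C2", "hicn": ""}],
    required=["claim_id", "hicn"])
```

The error set feeds the control reports that keep bad data in front of the business users until it is fixed upstream.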

Billing Reconciliation Project:
To be PCI compliant, Electric Insurance outsourced the credit card and ACH (pre-authorized check) customer payment transactions to a vendor, TrustCommerce/5th3rd (TC). As a result, no credit card or bank account data will be stored within Electric's network, with the exception of the EFT forms in the secured FileNet doc class. Electric must change payment processing within its current applications to use the TC payment gateway while ensuring the customer's experience is not negatively impacted during the payment process. The payment transaction will be sent to TC via their TCLink API (Application Programming Interface), and TCLink will return the transaction status (i.e., approved, decline, baddata, or error) to Electric. Controls and reports must maximize automation to ensure daily reconciliations are maintained (Billing/Finance, TC, 5th3rd). Electric will query the TC Vault database on a nightly basis using the API query function. A new data warehouse, a BO universe, and reports are needed to support the reporting requirements.

Responsibilities:

  • Designed and developed an ETL solution that extracts data from various payment systems across the enterprise and reconciles it with the vendor's (TrustCommerce) processing system.
  • Developed an API call to extract the data from the vendor site on a nightly basis for a given time period
  • Developed an ETL process to extract the data from SQL and load it to Datacom using the native SQL ODBC driver; the data is further consumed in the mainframe billing systems
  • Designed and developed a Business Objects universe to provide historical payment variances, as well as reports that contain daily payment variances.
  • Designed and developed purge routines for the data warehouse and operational systems.
  • Developed control mechanisms to reconcile the dollar amounts and row counts at consolidated and detail levels
  • Developed the data model using Sybase PowerDesigner
  • Helped QA in validating the test results.
  • Developed Run books for production support
  • Estimated the Database space needed based on the data volume and the data structures.
Environment: Informatica PowerCenter 8.1.1, SQL Server 2008/2005, Business Objects XI, TFS, Quality Center, Windows XP, Microsoft Visio, Python2.5, Autosys
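The reconciliation control described above boils down to comparing row counts and dollar totals between the internal extract and the vendor extract. A minimal sketch, with illustrative transaction tuples (the real control compared both consolidated and detail levels):

```python
def reconcile(internal, vendor):
    """Compare row counts and dollar totals between the internal
    payment extract and the vendor extract.

    `internal` and `vendor` are lists of (txn_id, amount) pairs;
    the data shown is illustrative, not real payment data.
    """
    count_diff = len(internal) - len(vendor)
    amount_diff = round(sum(a for _, a in internal)
                        - sum(a for _, a in vendor), 2)
    missing = sorted({t for t, _ in internal} - {t for t, _ in vendor})
    return {"count_diff": count_diff,
            "amount_diff": amount_diff,
            "missing_from_vendor": missing}

result = reconcile([("T1", 100.00), ("T2", 59.99)],
                   [("T1", 100.00)])
```

Any non-zero differences or missing transaction IDs would be surfaced on the daily variance reports.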

Check Reconciliation Project:
Electric Insurance was lacking a consolidated view of the checks processed for its commercial and personal lines businesses. The checks were processed and printed with an outside vendor, CGI. Daily controls and reconciliation were critical to ensure data accuracy within the company and the vendor.

Responsibilities:

  • Worked closely with the Business to understand the reporting requirements
  • Designed and developed QlikView reports to reconcile the data between the vendor, CGI, and target systems
  • Created several reports to reconcile the check and document/attachment data
  • Designed and developed Informatica processes to extract data from internal check issue systems and the vendor
Environment: Informatica PowerCenter 8.1.1, SQL Server 2008/2005, QlikView, TFS, Quality Center, Windows XP, Microsoft Visio, Python2.5, Autosys

Confidential:
Confidential was implemented to solve or enhance the following concerns from each of the following departments across the enterprise:
Business Profitability - Assess traditional measures such as expense ratios and loss ratios.
Business Quality - Analyze new and existing business for known quality measures that drive profitability.
Call Center Productivity - Measure representative performance and service levels.
Claims Operational Analysis - Analyze claims processing, payments, and other operational activities.
Channel Profitability Analysis - Assess profitability by channel.
Cross Product Analysis - Determine characteristics and behaviors of existing and potential multi-line customers.
Cross Product Marketing - Market alternative products to existing policyholders.
Overall Sales Analysis - Measure the efficacy of the new business process.
Pricing/Actuarial Analysis - Assess risk exposure and reflect it in the pricing of policies, reserving practices, and catastrophe management.
Regulatory Reporting - Provide accurate information in a variety of formats to a set of regulatory and industry-governing bodies.

Responsibilities:

  • Developed Control mechanisms to ensure data accuracy between EDH and the Mart.
  • Developed a generic subroutine that consolidates name and address data for further standardization using data quality tools (DQ-BOXI)
  • Understood the business requirements, prepared the source-to-target document, and documented transformation rules
  • Worked closely with the project management teams to come up with a reasonable time schedule for projects.
  • Estimated the Database space needed based on the data volume and the data structures.
  • Developed complex stored procedures to automatically handle updates for future sources
  • Developed complex ETL to dynamically build queries based on the rules provided by the end user
  • Designed and Developed Business objects universes to slice and dice the quote/policy metrics across different time periods
  • Built various daily/monthly/quarterly reports to cater to department-specific needs.
  • Developed the data model using Sybase PowerDesigner
  • Developed data quality routines to standardize name and address across different product lines to enable a 360-degree view of the customer.
  • Developed error routines to constantly push the bad data in front of the business users to correct the data in source systems
  • Designed and developed a Business Objects universe to cater to the reporting needs.
  • Captured metadata (definitions and business rules pertaining to the data) in a repository, helping ensure consistency of processing as well as consistent meaning of the data across the enterprise.
  • Helped QA in validating the test results.
  • Created Run books for production support
Environment: Informatica PowerCenter 8.1.1, SQL Server 2008/2005, Business Objects XI 11.2, DataQuality Architect 11.7, TFS, Quality Center, Windows XP, Microsoft Visio, Python2.5, Autosys, Informatica Power Exchange

Confidential:
The goal of the Confidential project, the first phase of a Corporate Profitability Initiative, is to enable more flexible access to the General Ledger (GL) and Accounts Payable (AP) data using a contemporary analysis and reporting tool. The Lawson desktop application in use today is geared more to General Ledger maintenance than to leading-edge data analysis and report writing, and does not allow analysis to be taken to a new level. This robust reporting and analysis tool would be a first step in enabling the Financial Planning and Analysis (FPA) group to better provide relevant and timely analysis to cost center managers for decision-making purposes. A second benefit would be the implementation of the core architecture to be leveraged in the second phase of the initiative: fully accounted Personal Lines income statements by product, state, and region.

Responsibilities:

  • Led a team of junior developers/consultants to help supplement the development work for the project.
  • Translated the business requirements into technical requirements, prepared the source-to-target document, and documented transformation rules
  • Developed an MD5 approach to capture data changes and standardized the code for use across the EDH and data marts
  • Developed a common strategy to source a consistent set of records from a 24/7 live-update system, which helps populate consistent data in related transaction tables.
  • Developed mappings to populate the dimensional models for Budget, Cash Book, and General Ledger
  • Developed controls to ensure data movement between source and target systems
  • Worked closely with the financial analysts to digest the business requirements
  • Developed slowly changing dimensions to capture history
Environment: Informatica PowerCenter 8.1/7.1.2, SQL Server 2000/2005, VSS, Test Director, Windows XP, Microsoft Visio, Python2.5, Autosys 4.1.5,Informatica Power Exchange
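The MD5 change-capture approach mentioned above can be sketched as hashing the non-key columns of each row and comparing against the previously stored hash. The key/column layout and the pipe delimiter are assumptions for illustration, not the project's actual standard.

```python
import hashlib

def row_hash(row, delim="|"):
    """MD5 over the concatenated non-key columns; a changed hash
    flags the row for update, an unseen key flags an insert.
    Column order and delimiter are illustrative assumptions."""
    payload = delim.join(str(v) for v in row)
    return hashlib.md5(payload.encode("utf-8")).hexdigest()

def detect_changes(source, target_hashes):
    """source: {key: (col1, col2, ...)}; target_hashes: {key: md5}."""
    inserts, updates = [], []
    for key, cols in source.items():
        h = row_hash(cols)
        if key not in target_hashes:
            inserts.append(key)
        elif target_hashes[key] != h:
            updates.append(key)
    return inserts, updates

src = {"A1": ("Smith", "Boston"), "A2": ("Jones", "Salem")}
tgt = {"A1": row_hash(("Smith", "Beverly"))}  # city changed upstream
ins, upd = detect_changes(src, tgt)
```

Keeping the hash alongside the target row avoids comparing every column on every load.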

Replatforming Project: This project involved migrating the existing Informatica 7.1 and SQL Server 2000 installation, along with the hardware, to Informatica 8.0 and MS SQL Server 2005.

Responsibilities:

  • Parsed out the Informatica workflows to a grain of one session per workflow using Python scripts, and manipulated the workflow XML files to configure the connections.
  • Developed Python scripts to parameterize the session parameters during the decomposition.
  • Developed checksum SQL scripts to validate the old data warehouse (OASIS) versus the new data warehouse (INSIGHT)
  • Identified the drawbacks in migrating from Informatica 7 to 8 and fixed the bad maps using Informatica Technical Support
  • Converted the DTS packages into Informatica maps to execute them as pre/post source/target loads
  • Used Informatica PowerExchange to extract data from one of EIC's operational systems, Datacom.
  • Developed Python scripts to generate the JIL file to create the Autosys jobs
  • Imported the JIL file into Autosys and enforced the job dependencies, including CA7 dependencies
  • Fixed corrupted maps during migration, which involved unconnected sequence generators and LTRIM/RTRIM functions in lookups and source qualifiers.
  • Extensively worked with repository tables to update all the sequence generators at once by querying the max values from the database tables
  • Created automated documentation by querying the metadata from the Informatica repositories.
Environment: Informatica PowerCenter 8.0/7.1.2, SQL Server 2000/2005, VSS, Test Director, Windows XP, Microsoft Visio, Python2.5, Autosys 4.1.5,Informatica Power Exchange
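Generating Autosys JIL from Python, as in the replatforming work above, can look roughly like the sketch below. The attribute set shown is a minimal subset of JIL, and the machine name and job names are placeholders, not the project's actual schedule.

```python
def jil_for_job(name, command, condition=None, machine="etl_box"):
    """Render one Autosys JIL insert_job stanza. Only a minimal set
    of attributes is shown; `machine` is a placeholder value."""
    lines = [
        f"insert_job: {name}   job_type: c",
        f"command: {command}",
        f"machine: {machine}",
    ]
    if condition:
        # s(job) = run only after the named job succeeds
        lines.append(f"condition: s({condition})")
    return "\n".join(lines) + "\n"

jil = jil_for_job("wf_load_claims",
                  "pmcmd startworkflow wf_load_claims",
                  condition="wf_stage_claims")
```

Concatenating one stanza per workflow yields a JIL file that can be imported in bulk, with success conditions enforcing the job dependencies.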

Close Rate By Rep:
The goal of this project is to accurately depict the “real” close rate for individual representatives within the sales department. The close rate will need to be determined by factoring in a quote and a ½ or full bind for every rep who participated in the policy sale. One close rate should be displayed for each rep (two are currently displayed because of different initials). By identifying the improvement opportunities, leadership will be able to pinpoint problem areas and improve overall sales performance.

Responsibilities:

  • Analyze and validate the data on the source systems.
  • Translate the business requirements into technical specifications.
  • Design and develop an ETL strategy to transfer the data from source and ODS to data marts
  • Design and develop an error handling process to capture bad data and present it to the business system experts
  • Develop Mapplets for code reusability and consistency
  • Write SQL scripts to compare and find the missing data in ODS and EDW.
  • Work on fixes and enhancements on the workers compensation project on commercial lines
  • Work on resolving the performance issues in the existing maps.
  • Use SQL Server Agent to schedule the Informatica and SQL jobs
  • Convert SQL jobs to DTS jobs to avoid linked server issues when calling a database task from SQL Server Agent
Environment: Informatica PowerMart 7.1, SQL Server 2000, PVCS, Test Director, Windows 2000/XP, Microsoft Visio.

GE payroll deduct Project:
Deliver a convenient, authorized sign-up mechanism at policy issue or post-issue. Enable payroll deduct capability in pilot mode with EIC employees in April 2009, and with GE by the 2nd quarter of 2009. Provide the capability to deploy to other affinity groups. Eliminate installment fees and provide a payroll deduct policy discount. Provide a smooth transition between payroll deduct and bill direct. Proactively communicate to EE. Deliver timely, accurate transaction files to the affinity group per their requested schedules. Deliver secure, compliant, and efficient business processes that facilitate growth of the GE channel, including timely and accurate reconciliation of payment files. Enable growth of other affinity groups.

Responsibilities:

  • Developed ETL Processes to extract data from GE on a weekly basis.
  • Implemented slowly changing dimensions to capture history

Confidential, Boston, MA July '04 – Aug ‘05
Role : Sr Informatica Developer
Manulife Financial's Manulife Private Account (MPA) is a multi-style, separately managed account in which the customer's assets are managed by well-respected investment firms. It offers style diversification, customization opportunities, tax management, and a unique overlay process that limits over-concentration in an investment style and duplication of securities. Decommissioning Oracle and implementing the same in SQL Server using Informatica was the goal of the project.

Responsibilities:

  • Translated existing PL/SQL code into technical specifications for developing mappings using Informatica.
  • Defined and implemented ETL design, development and implementation standards and procedures to be applied to all the development projects
  • Developed a data transfer strategy to extract and integrate data from MS SQL Server and flat files into the ODS and data mart.
  • Implemented slowly changing dimensions according to the requirements.
  • Developed complex mappings/sessions using normalizer, aggregator, lookup, update strategy, router and joiner transformations for data loading.
  • Created reusable mapplet/session/command with embedded business logic.
  • Involved in Generation and administration of system test plans and test cases
  • Developed test scripts to compare the data statistics/differences in oracle and SQL
  • Extensively used command, control, event wait, decision tasks to control the logical structure of the program with in the workflow.
  • Extensively used PVCS to check in move forms, parameter files, and database scripts required for migration.
  • Partitioned sessions, modified caches/buffers, and tuned transformations for better performance.
  • Migrated Informatica sessions from development to certification and integration, setting up the directories for parameter files, logs, and bad files. Developed VB scripts to deploy the directory structure needed when moving across environments.
  • Performed unit testing by comparing the set of data in oracle and SQLServer systems.
  • Provided production support for the applications through the warranty period.
  • Extensively used Test Director to log/close/reassign defects identified by the Business system analysts/Testers.
Environment: Informatica PowerCenter 6.2, SQL Server 2000, Oracle 9i, PVCS, Test Director, TOAD 7.5, Windows 2000/XP, Microsoft Visio.
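The Oracle-versus-SQL Server comparison scripts mentioned above amount to computing per-column statistics on each side and flagging any that differ. A minimal sketch with illustrative data; the real scripts ran as SQL against both databases.

```python
def column_stats(values):
    """Count, null count, min, and max for one column's values."""
    non_null = [v for v in values if v is not None]
    return {"count": len(values),
            "nulls": len(values) - len(non_null),
            "min": min(non_null) if non_null else None,
            "max": max(non_null) if non_null else None}

def compare_stats(oracle_col, sqlserver_col):
    """Return the names of the stats that differ between systems."""
    o, s = column_stats(oracle_col), column_stats(sqlserver_col)
    return [k for k in o if o[k] != s[k]]

# Illustrative values: one null dropped during the migration
diffs = compare_stats([10, 20, None], [10, 20, 30])
```

Matching statistics on both sides give quick confidence that a table migrated cleanly before row-level comparison.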

Confidential, Louisville, CO May '03 – July ‘04
Role: ETL Developer/Analyst
StorageTek requires the ability to perform detailed analysis on actual revenue and cost of revenue to better respond to changes in the marketplace. Better information will enable StorageTek to quickly identify trends and help drive their strategic objectives of profitable revenue growth and unleashing trapped profitability. In support of this, eight major categories of information needs have been identified: product sales and cost of sales analysis, service revenue and cost of revenue analysis, sales order analysis, order fulfillment, operating expenses, general ledger, supply chain management, and plan and forecast.

Responsibilities:

  • Interacted with business analysts and translated business requirements into technical specifications.
  • Assisted in designing the data models for data warehouse and datamarts.
  • Involved in designing ETL architecture, naming standards for folders/transformations/mappings/sessions/parm_files/scripts across the repository.
  • Created shared folders, local and global shortcuts.
  • Developed data transfer strategy to extract and integrate data from SAP, CLARIFY ACES, PRO, ORACLE, EXCEL FILES, MS SQL SERVER, FLATFILES, SIEBEL into data warehouse.
  • Developed complex mappings/sessions using Informatica power center for data loading.
  • Extensively used ERP source qualifier, normalizer, aggregator, lookup, update strategy, router, joiner transformation.
  • Implemented slowly changing dimensions according to the requirements.
  • Partitioned sessions, modified caches/buffers, and tuned transformations for better performance.
  • Migrated Informatica sessions from development to QA and from QA to production, and set up the UNIX directories for parameter files, logs, and bad files
  • Performed unit testing
  • Incrementally extracted data using shell scripts and scheduled those using AUTOSYS.

Environment: Informatica PowerCenter 6.1/5.1.2, Oracle 9i/8i, SAP, DB2, AutoSys 3.5, TOAD 7.5, DataFlux PowerStudio 6.0, SQL Server 2000, UNIX Solaris 7.0
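Incremental extraction with a last-run watermark, as in the scheduled scripts above, can be sketched as follows. Python is used here for illustration (the original used shell scripts), and the `updated_at` field and control-table handling are assumptions.

```python
def incremental_extract(rows, watermark):
    """Keep only rows changed since the previous run; in a real job
    the watermark is read from, and written back to, a control table.
    Field names are illustrative assumptions."""
    changed = [r for r in rows if r["updated_at"] > watermark]
    new_watermark = max((r["updated_at"] for r in changed),
                        default=watermark)
    return changed, new_watermark

rows = [{"id": 1, "updated_at": "2004-01-10"},
        {"id": 2, "updated_at": "2004-02-01"}]
changed, wm = incremental_extract(rows, "2004-01-15")
```

ISO-formatted timestamps compare correctly as strings, which keeps the watermark logic simple.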

Confidential, Radnor, PA Jun ‘02 – Jan ‘03
Role: ETL developer

The project was developed for the manufacturing and marketing divisions to provide periodic non-retail sales information from Source Non-Retail (SNR) for the Abelcet anti-fungal market at the zip code level and to manage inventory for manufacturing.

Responsibilities:

  • Involved in the development of Informatica mappings and tuned them for better performance.
  • Administered the Informatica repository by creating and managing user profiles and metadata
  • Created Informatica mappings with stored procedures to build business rules to load data.
  • Used various transformations (source qualifier, aggregator, connected & unconnected lookups, filter, and sequence) to handle situations depending upon the requirements
  • Called stored procedures to perform database operations in pre-session and post-session commands
  • Wrote parameter files for batch processing from different repositories.
  • Created partitions to load the data concurrently
  • Loaded bad data using reject loader utility
  • Involved in writing shell scripts and automating the batch jobs using crontab.
  • Performed Unit Testing and tuned for better performance.
  • Wrote UNIX shell scripts for getting data from all the source systems to the data warehousing system.
  • Built new dimensions in Universes to support the new reporting requirements of business users.
  • Used SQL tools like TOAD to run SQL queries and validate the data pulled in BO reports.
  • Created the reports using Business Objects functionality like Combined Queries, Slice and Dice, Drill Down, Functions, Cross Tab, Master Detail and Formulae etc.

Environment: Informatica PowerCenter 5.1, Business Objects 5.1, Oracle 8i, PL/SQL, SQL 2000, Windows NT, Sun Solaris 7.0, DB2

Confidential, Dept of Information Systems, Shippensburg, PA. Aug ‘00 – May ‘02
Role: Computer Technology Assistant

Responsibilities included teaching and evaluating students; designing course labs and projects for programming in SQL, PL/SQL, ASP, HTML, Linux, and Unix; and configuring, installing, and troubleshooting software/hardware.

Confidential, Harrisburg, PA. May '01 – Oct ‘01
Role: Web Programmer

Responsibilities: Developed and maintained scripting applications (VBScript) in Active Server Pages for the company's website, and imported data from Oracle to MS SQL Server through Data Transformation Services (DTS).

Education:
  • Master of Science in Computer Science
  • Bachelor of Science in Engineering
