- Informatica PowerCenter professional with over 3 years’ experience
- Solid experience with, and understanding of, ETL processes
- Experience developing SCD Type 1 and Type 2 mappings in data warehouses
- Develop mappings to load fact tables
- Develop Informatica PowerCenter mappings using transformations such as Lookup, Aggregator, Joiner, and Update Strategy, along with mapplets and reusable transformations
- Follow ETL naming standards and best practices, promoting code reusability through mapplets, reusable transformations, and shortcuts
- Analyze data defects to respond to data quality concerns
- Work with Informatica mappings to fix data defects and enhance existing functionality.
- Monitor jobs in production and fix any production issues that may arise
- Work with Informatica admin to migrate code from DEV to QA and Prod.
- Develop data extracts from the data warehouse for delivery to third-party users
- Solid experience working with Oracle, SQL Server, and DB2 databases
- Solid hands-on skills with PL/SQL and SQL
- Experience writing shell scripts
- Experience working in Agile environments
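Representative of the SCD Type 2 work above: a minimal shell-plus-SQL sketch of the close-and-insert pattern. Table and column names (dim_customer, stg_customer, src_hash) are hypothetical; in practice this logic was built with Informatica Update Strategy transformations, and this is only an illustrative equivalent.

```shell
#!/bin/sh
# Sketch: close out changed rows and insert new versions for an SCD Type 2
# dimension. Table and column names are hypothetical; credentials come from
# the environment. By default the SQL is printed for review instead of
# executed; set RUN_SQL=1 on a host with sqlplus to actually run it.

SQL=$(cat <<'EOF'
UPDATE dim_customer d
   SET d.eff_end_dt = SYSDATE, d.current_flag = 'N'
 WHERE d.current_flag = 'Y'
   AND EXISTS (SELECT 1 FROM stg_customer s
                WHERE s.customer_id = d.customer_id
                  AND s.src_hash <> d.src_hash);

INSERT INTO dim_customer
       (customer_id, name, src_hash, eff_start_dt, eff_end_dt, current_flag)
SELECT s.customer_id, s.name, s.src_hash, SYSDATE, NULL, 'Y'
  FROM stg_customer s
 WHERE NOT EXISTS (SELECT 1 FROM dim_customer d
                    WHERE d.customer_id = s.customer_id
                      AND d.current_flag = 'Y'
                      AND d.src_hash = s.src_hash);
COMMIT;
EOF
)

if [ "${RUN_SQL:-0}" = "1" ]; then
    printf '%s\n' "$SQL" | sqlplus -s "$DB_USER/$DB_PASS@$DB_TNS"
else
    # Dry run: show the statements for review.
    printf '%s\n' "$SQL"
fi
```

The UPDATE expires the current row for any customer whose staged hash differs; the INSERT then adds a fresh current row for changed and brand-new customers alike.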
Languages: PL/SQL, SQL, Shell script
Business Intelligence: Informatica 9.x
DBMS: Oracle 11g, MS Access, DB2, MS SQL Server
OS/Environment: UNIX, Windows
Tools: Toad, SQL Developer, IBM Tivoli
Confidential, Prairie, WI
Environment: Windows, UNIX, Oracle 11g, SQL Server on AWS/RDS, DB2, SQL, PL/SQL, Informatica 9.x
- Attending source-to-target review meetings with the Data Modeler and Business Analyst.
- Creating staging/support tables required for delta data capture.
- Building Informatica mappings based on the source-to-target mapping documents.
- Creating unit test case documents and testing code.
- Creating deployment groups and working with the Deployment Manager to deploy code to UAT and Production.
- Fixing mapping defects
- Enhancing existing mappings based on new requirements.
- Monitoring jobs and providing production support, including emergency fixes for job failures.
- Performance tuning long-running Informatica jobs.
- Attending sprint planning meetings and providing estimates for the work to be done.
- Attending daily standup meetings to provide status on assigned tasks.
- Meeting with the Cognos and SSRS reporting teams to explain table structures and how to join tables to extract data for reporting.
- Generating data files using SQL queries for ad hoc requests.
- Writing queries against SQL Server on AWS RDS.
- Using the Informatica SQL Server Cloud Connector to write to SQL Server on AWS.
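One way the ad hoc SQL Server extracts above can be scripted. The endpoint, database, table, and column names are hypothetical placeholders, and sqlcmd is only invoked when an endpoint is actually configured.

```shell
#!/bin/sh
# Sketch: comma-delimited ad hoc extract from SQL Server (e.g. on AWS RDS)
# using sqlcmd. Endpoint, database, and query are hypothetical placeholders.
OUT=claims_extract.csv
QUERY="SET NOCOUNT ON; SELECT claim_id, member_id, paid_amt FROM dbo.claims WHERE paid_dt >= '2016-01-01';"

if [ -n "${SQLSERVER_HOST:-}" ]; then
    # -s, sets the column separator; -W trims trailing spaces; -o writes the file.
    sqlcmd -S "$SQLSERVER_HOST" -d claims_db -Q "$QUERY" -s, -W -o "$OUT"
else
    # No endpoint configured: show the command that would run.
    echo "sqlcmd -S <rds-endpoint> -d claims_db -Q \"$QUERY\" -s, -W -o $OUT"
fi
```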
Confidential, Abbott Park, IL
Environment: Windows, UNIX, Oracle 11g, SQL Server, DB2, SQL, PL/SQL, Informatica 9.x
- Generated source-to-target mapping documents from LSH and reviewed them for errors or inaccuracies.
- Discussed mapping document issues/errors with the BA, sought clarification, and requested updates.
- Analyzed source system tables based on the mapping document and created ETL specs for extracting data from the source system.
- Created target tables to hold data loaded from the source system via ETL.
- Created Informatica ETL mappings to load data from the source system to target tables in LSH.
- Developed and deployed Oracle PL/SQL code to move data from source systems to LSH.
- Conducted root cause analysis when data defects were raised in production.
- Used LSH to produce data files for consumption in SAS.
- Worked with the Test Analyst to resolve any open issues with the data being loaded into LSH.
- Created unit test cases and conducted unit testing.
- Created labels and deployment groups to enable code migration.
- Analyzed production data quality tickets and triaged them to prioritize the most critical issues.
- Promoted code reusability by using mapplets and reusable transformations.
- Developed ETL code to check data quality and log errors.
- Coordinated database deployments with the DBA, Informatica admin, and UNIX admin.
- Provided on-call production support for issues affecting data loads.
- Tuned long-running mappings to reduce completion times.
- Attended status meetings and reported the status of work items and projected end dates.
- Discussed data model questions/issues with the Data Modeler and resolved them.
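A minimal sketch of the kind of data-quality gate described above: reconciling source and target row counts after a load. In the real job the counts came from SQL against the source system and LSH; here they are plain arguments so the gate logic stands on its own.

```shell
#!/bin/sh
# Sketch: post-load row-count reconciliation used as a simple data-quality
# gate. The counts would normally come from sqlplus queries; passing them
# as arguments keeps the check logic testable on its own.

check_counts() {
    src_rows=$1
    tgt_rows=$2
    if [ "$src_rows" -ne "$tgt_rows" ]; then
        # Mismatch: report it and return a non-zero status so the calling
        # job stream can halt and page support.
        echo "DQ FAIL: source=$src_rows target=$tgt_rows"
        return 1
    fi
    echo "DQ PASS: $src_rows rows reconciled"
}

# Example: check_counts 1200 1200  ->  "DQ PASS: 1200 rows reconciled"
```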
Confidential, Chicago, IL
Environment: Windows, UNIX, Oracle 11g, DB2, SQL, PL/SQL, Informatica 9.x
- Developed source-to-target mapping documents for new mappings that needed to be created.
- Analyzed existing mappings and created source-to-target mapping documents for them.
- Worked with the Business Analyst to understand requirements and document them.
- Developed Informatica mappings to populate target tables used for conversion.
- Unit tested mappings to determine whether they met requirements.
- Explained requirements to the testing team and helped them execute workflows in the QA environment.
- Created and scheduled jobs in the job scheduler and integration-tested them.
- Worked with the Informatica admin to migrate code to higher environments.
- Attended code review meetings, explained code functionality to reviewers, and obtained sign-off.
- Analyzed production data defects and proposed plans to fix them.
- Created production deployment plans and had them verified by the team lead.
- Worked with Informatica Admin and DBA to deploy code in production
- Monitored job executions in production to ensure that load is progressing as planned and troubleshoot any issues that may arise.
- Worked in an Agile environment
Confidential, Ann Arbor, Michigan
Environment: Windows, UNIX, Oracle 11g, SQL, PL/SQL, Informatica 9.x
- Monitored production jobs, fixing and restarting failed jobs as needed.
- Developed SCD Type 1 and Type 2 mappings and fact table loads for the data warehouse.
- Worked with snowflake and star schemas.
- Worked on tickets requesting data extracts for customers.
- Analyzed data issues in extracts sent to customers and provided resolutions.
- Fulfilled ad hoc data requests by writing SQL queries.
- Provided status updates during daily standups.
- Developed UNIX shell scripts to invoke SQL scripts.
- Created jobs and job streams in Tivoli
- Analyzed data in the ODS/DWH to determine the feasibility of implementing project requirements.
- Created source-to-target mapping documents from the requirements document.
- Attended meetings to understand requirements.
- Worked with the Data Modeler to get the table structures created.
- Developed Informatica mappings and workflows per requirements.
- Conducted code reviews of the mappings and workflows.
- Developed UNIX scripts for invoking Informatica workflows.
- Moved code to the migration folder and worked with the Informatica admin on code migration to QA and Prod.
- Created Jobs and Job Streams on Cisco Tidal scheduler.
- Provided support to the QA team and resolved tickets.
- Provided 90-day warranty support for CBT code before handoff to Operations and Maintenance team.
- Answered questions raised by the customer with regards to data on the file.
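The "UNIX script invokes Informatica workflow" pattern above, sketched with pmcmd. The service, domain, folder, and workflow names are hypothetical, connection credentials are omitted, and the command only executes when the dry-run default is turned off (pmcmd must be on PATH on the Informatica server).

```shell
#!/bin/sh
# Sketch: start an Informatica workflow from a UNIX script via pmcmd and
# fail fast on a bad return code. Service/domain/folder/workflow names are
# hypothetical placeholders; credentials are omitted for brevity.

SVC=INT_SVC_PROD
DOMAIN=DOM_PROD
FOLDER=DWH_LOADS
WORKFLOW=wf_load_dim_customer

CMD="pmcmd startworkflow -sv $SVC -d $DOMAIN -f $FOLDER -wait $WORKFLOW"

if [ "${DRY_RUN:-1}" = "1" ]; then
    # Default to a dry run so the invocation can be reviewed or logged.
    echo "$CMD"
else
    $CMD
    rc=$?
    if [ "$rc" -ne 0 ]; then
        echo "workflow $WORKFLOW failed with rc=$rc"
        exit "$rc"
    fi
fi
```

The -wait flag keeps the script blocked until the workflow finishes, so a scheduler (e.g. Tidal or Tivoli) sees the workflow's success or failure as the script's exit code.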
Environment: Windows, UNIX, Oracle 10g, PL/SQL
- Developed code to segment the population based on demographics to target VAS messaging.
- Developed PL/SQL code to retrieve metrics on the popularity of targeted VAS.
- Developed PL/SQL code and embedded it in a proprietary scripting tool to retrieve data for customer contests (an example of a VAS).
- Used the proprietary scripting tool to set up VAS for customers.
- Loaded FTP data arriving from Confidential into transaction tables for further processing.
- Used SQL*Loader to perform ad hoc data loads into tables.
- Migrated code from Development to QA.
- Ensured timely delivery of code.
- Created status reports and attended status calls.
- Analyzed and resolved production support issues with the aim of preventing recurrence.
- Scheduled jobs in Tivoli by creating jobs and job streams
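An illustrative sketch of the SQL*Loader ad hoc loads mentioned above: a generated control file plus the sqlldr invocation. Table, file, and column names are hypothetical, and sqlldr only runs when RUN_LOAD=1 on a host with an Oracle client.

```shell
#!/bin/sh
# Sketch: ad hoc load of a pipe-delimited feed into a staging table with
# SQL*Loader. Table, file, and column names are hypothetical placeholders.

# Generate the control file describing the input layout.
cat > txn_load.ctl <<'EOF'
LOAD DATA
INFILE 'txn_feed.dat'
APPEND INTO TABLE stg_transactions
FIELDS TERMINATED BY '|'
TRAILING NULLCOLS
(txn_id, msisdn, txn_amt, txn_dt DATE 'YYYY-MM-DD')
EOF

if [ "${RUN_LOAD:-0}" = "1" ]; then
    # Actual load; rejected rows land in the .bad file, details in the log.
    sqlldr "$DB_USER/$DB_PASS@$DB_TNS" control=txn_load.ctl log=txn_load.log
fi
```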