Computer Programmer Resume
Pittsburgh, PA
PROFESSIONAL SUMMARY:
- Nine-plus years (9 years and 9 months) of IT experience in the analysis, design, development, testing, and implementation of business application systems in the Banking and Financial Services sector (US banking, risk management & finance), including almost 4 years and 2 months of US experience.
- Strong experience in the Analysis, Development, User Acceptance Testing, and Implementation of Business Intelligence solutions using Data Warehouse/Data Mart Design, ETL, OLAP, BI, Client/Server applications.
- Strong data warehousing ETL experience using Informatica PowerCenter 10.2/9.5/9.1/8.6/8.5 client tools - Mapping Designer, Repository Manager, Workflow Manager/Monitor - and server tools such as Informatica Server and Repository Server Manager.
- Expertise in Data Warehouse/Data Mart, OLTP and OLAP implementations, combined with project scoping, analysis, requirements gathering, data modeling, effort estimation, ETL design, development, system testing, implementation, and production support.
- Involved in building the ETL architecture & data flow using Microsoft Visio.
- Strong experience in Extraction, Transformation and Loading (ETL) and extensive testing of data from various sources into data warehouses and data marts using Informatica PowerCenter 10.2/9.6/9.5/9.1/8.6/8.5 (Repository Manager, Designer, Workflow Manager, Workflow Monitor) and PowerExchange as ETL tools on Mainframe, Oracle, Teradata and SQL Server databases.
- Expertise in working with relational databases such as Oracle 12c/11g/10g/9i, SQL Server 2016/2012/2008 and Teradata.
- Experience in Agile methodology and extensive use of Jira to create and implement stories and releases.
- Extensive experience in developing Stored Procedures, Functions, Views and Triggers, Complex SQL queries using Oracle PL/SQL.
- Worked hands-on with MS SQL Server queries and stored procedures.
- Experience in resolving on-going maintenance issues and bug fixes; monitoring Informatica sessions as well as performance tuning of mappings and sessions.
- Experience in all phases of data warehouse development, from requirements gathering through code development, unit testing, and documentation.
- Extensive experience in writing UNIX shell scripts and automation of the ETL processes using UNIX shell scripting.
- Experience in using Automation Scheduling tools like CA7 and Control-M.
- Hands-on experience across all stages of Software Development Life Cycle (SDLC) including business requirement analysis, design documentation, data mapping, build, unit testing, systems integration, user acceptance testing and defect fixing.
- Semantic code checking using the Harvest/uDeploy version control tools.
- Conducted unit testing by creating and executing test cases to ensure successful data loads; reviewed components and test results prepared by other team members.
- Monitored Job runs through CA7, IDASH & Informatica Monitor Tools.
- Resolving ServiceNow incidents and application issues in the production environment.
- Excellent interpersonal and communication skills; experienced in working with senior management, business users and developers across multiple disciplines.
- Daily stand-ups with the Scrum Master and weekly status meetings with the Business Owner and Project Manager.
- Managed an environment of trust and cooperation through an open exchange of ideas.
TECHNICAL SKILLS:
Databases: Oracle 12c/11g/10g/9i/8x, MS SQL Server 2012/2008, Teradata 16.2
Operating Systems: UNIX, Linux, Windows 10/7/XP, Mainframes
Database tools: TOAD, PL/SQL Developer, SQL*Plus, SQL Developer
ETL tools: Informatica PowerCenter 10.2, 9.6, 9.5.1 & 8.6.1, Informatica Power Exchange, NDM
Reporting Tool: OBIEE 12c/11g/10g, Business Objects
Data Modeling Tools: Microsoft Visio
Other Supporting Tools: Harvest, IBM uDeploy, ALM 11.0, PuTTY, Revelus, Atlassian Jira, Acorn PA 5g, Acorn EPS, ServiceNow
Scheduling Tools: CA7, Control-M
Languages: SQL, PL/SQL, XML, Linux Shell Scripting
Domain Knowledge: Banking & Financial Services
PROFESSIONAL EXPERIENCE:
Confidential, Pittsburgh, PA
Computer Programmer (Technical Lead / Sr. ETL/Informatica Developer)
Environment: Oracle 12c, Informatica PowerCenter 10.2.1, Informatica PowerExchange, uDeploy, Artifactory, Linux, PL/SQL, SQL Server 2016/2012, PuTTY/WinSCP, Atlassian Jira, Microsoft SQL Server Management Studio, Toad, Teradata Studio, Shell Scripts, ServiceNow, Acorn PA5g, Mainframes, CA7, Windows 10, OBIEE 12c
Responsibilities:
- Working with business partners to support critical issues/requirements and make sure that identified issues are tracked, reported on and resolved in a timely manner.
- Involved in building the ETL architecture for SSM and Source to Target mapping to load data into Oracle database.
- Involved in requirements analysis, understanding business requirements, identifying the flow of information and analyzing the existing systems.
- Identifying strengths and weaknesses in existing processes to enhance existing ETL processes to meet evolving requirements.
- Created a stored procedure for the complex GL reconciliation process.
- Creating and verifying Informatica Power Center data mappings/sessions/workflows for SSM.
- Worked on Joiner, Aggregator, Update Strategy, Rank, Router, Lookup, Stored Procedure, Sequence Generator, Filter, Sorter, Web Services Consumer and Source Qualifier transformations.
- Developed a database procedure, as a value-add, to restate the historical data in fact and dimension tables for volumes, expenses and rates.
- Created a shell script to NDM the source files from the business SharePoint server (WSS) to the Informatica Linux server.
- Participating in test-driven development, peer reviews, Agile methodology and Scrum meetings.
- Performing unit testing on modified SSM components.
- Experienced in analyzing query logs and explain plans; identified performance bottlenecks and optimized the respective databases and processes for performance improvement.
- Defined the load strategy and created schedules for existing and new processes to meet business SLAs.
- Created Informatica Job to pull OFS rates report from OBIEE Dashboard using Web Services Consumer transformation and sent to Business Users as part of email communication.
- Semantic code checking using the uDeploy/Harvest/Artifactory version control tools.
- Peer review on the ETL components, test results and code deliverables.
- Extensive use of Acorn PA 5g to calculate expenses and rates.
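The NDM (Connect:Direct) file-transfer step mentioned in the bullets above can be sketched as a small shell wrapper. This is a hypothetical illustration only: the node names, paths and file names are placeholders, and the process text is printed as a dry run rather than submitted.

```shell
#!/bin/sh
# Hypothetical sketch of the NDM (Connect:Direct) transfer: build a
# process that copies a source file from the remote WSS-facing node to
# the Informatica Linux server. All names below are placeholders.

SRC_FILE="${1:-/wss/outbound/ssm_volumes.csv}"   # assumed remote path
DEST_FILE="/informatica/srcfiles/$(basename "$SRC_FILE")"
SNODE="WSS.NODE"                                 # assumed remote node name

# Emit the Connect:Direct process text; a real job would submit this
# through the Connect:Direct CLI instead of just printing it.
build_ndm_process() {
    cat <<EOF
SSMCOPY PROCESS SNODE=$SNODE
 STEP1  COPY FROM (FILE=$SRC_FILE SNODE)
             TO   (FILE=$DEST_FILE PNODE DISP=RPL)
EOF
}

build_ndm_process
```

Printing the generated process keeps the sketch runnable anywhere; only the submission step depends on a Connect:Direct installation.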
ACFR RTB Support ETL Centralisation
Confidential, Pittsburgh, PA
Environment: Oracle 12c/11g, Informatica PowerCenter 10.2.1/9.6, Informatica PowerExchange, uDeploy, Harvest, Artifactory, Linux, PL/SQL, SQL Server 2012, PuTTY/WinSCP, Atlassian Jira, Microsoft SQL Server Management Studio, Toad, Teradata Studio, Shell Scripts, ServiceNow, Acorn PA5g, Mainframes, CA7, Windows 10, OBIEE 12c/11g
Responsibilities:
- Monitored Job runs through CA7, IDASH & ETL Tools.
- Monitored database, Linux and Windows patching.
- Worked on decommissioning of old servers.
- Performed health checks of all the applications in the ACFR Run-the-Bank area.
- Extraction, Transformation and Loading (ETL) of data using Informatica PowerCenter.
- Understanding existing business model and customer requirements to support the implemented code in Production.
- Understanding the functionality of the existing systems.
- Participation in knowledge sharing sessions with application teams to understand the flow of the data.
- Identified risks and issues during the requirements and design phases.
- Resolved incidents and application issues in the production environment.
- Weekly status meeting with project manager.
Confidential, Pittsburgh, PA
Computer Programmer
Environment: Oracle 11g, Informatica PowerCenter 9.6, Informatica PowerExchange, Harvest, UNIX, PL/SQL, SQL Server 2008, PuTTY/WinSCP, Atlassian Jira, Microsoft SQL Server Management Studio, Toad, Teradata Studio, Shell Scripts, ServiceNow, Acorn EPS, Mainframes, CA7, Windows 7, OBIEE 11g
Responsibilities:
- Working with business partners to support critical issues/requirements and make sure that identified issues are tracked, reported on and resolved in a timely manner.
- Designed the process flow to generate volumes for the 16 Operations Service Groupings as part of the Phase I objective.
- Involved in requirements analysis, understanding business requirements, identifying the flow of information and implementing the design as an end-to-end ETL process.
- Created a stored procedure for the Org Hierarchy reconciliation process.
- Created complex ETL to generate the GL amount using the rolled-up and actual GLs and cost centers.
- Created CA7 DOC5, DOC40 documents for Job scheduling.
- Creating and verifying Informatica Power Center data mappings/sessions/workflows for SSM.
- Worked on Joiner, Aggregator, Update Strategy, Rank, Router, Lookup, Stored Procedure, Sequence Generator, Filter, Sorter, Web Services Consumer and Source Qualifier.
- Created a shell script that connects through the Oracle wallet to extract the information from the Volume table and send it to business users over email.
- Participating in test-driven development, peer reviews and development team meetings.
- Performing unit testing on newly created or modified SSM components.
- Created Informatica Job to pull SSM Volume report from OBIEE Dashboard using Web Services Consumer transformation and sent to Business Users as part of email communication.
- Peer review on the ETL components, test results and code deliverables.
- Extensive use of Acorn EPS to calculate expenses and rates.
- Resolved the issue for the Organizational Hierarchy with respect to cost center allocations in the production environment.
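The wallet-based extract-and-email step described in the bullets above can be sketched as follows. Everything here is a hypothetical placeholder (the wallet alias, SQL script name, spool path and recipients), and the commands are printed as a dry run rather than executed, since sqlplus and mailx are only available on the real server.

```shell
#!/bin/sh
# Hypothetical sketch: sqlplus authenticates through an Oracle wallet
# (the /@alias form, so no password lives in the script), spools rows
# from the Volume table, and mailx sends the result to the business.

WALLET_ALIAS="ssm_prod"                 # assumed tnsnames/wallet alias
SPOOL_FILE="/tmp/ssm_volumes.txt"
RECIPIENTS="business.users@example.com" # placeholder distribution list

extract_cmd() {
    # /@alias tells sqlplus to pick up credentials from the wallet
    printf 'sqlplus -s /@%s @volume_extract.sql %s\n' "$WALLET_ALIAS" "$SPOOL_FILE"
}

mail_cmd() {
    printf 'mailx -s "SSM Daily Volumes" %s < %s\n' "$RECIPIENTS" "$SPOOL_FILE"
}

# Print the commands the real job would run.
extract_cmd
mail_cmd
```

Using the wallet keeps database credentials out of the script and out of the scheduler's job definitions.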
Allowance for Loan and Lease Losses (ALLL)
Confidential, Pittsburgh, PA
Environment: Oracle 10g, Informatica PowerCenter 9.5/8.6/8.5, Informatica PowerExchange, Harvest, UNIX, PL/SQL, SQL Server 2008, PuTTY/WinSCP, Microsoft SQL Server Management Studio, Toad, Teradata Studio, Shell Scripts, ServiceNow, Mainframes, CA7, Windows 7/XP, OBIEE 10g
Responsibilities:
- Designed the process flow to generate the reserve amount for Consumer and Commercial lending systems.
- Created Oracle stored procedure to implement Roll rate methodology to calculate the inherent losses.
- Created Complex ETL to segregate the portfolios for loans and lease account based on delinquency period.
- Worked with business partners to support critical issues/requirements and make sure that identified issues are tracked, reported on and resolved in a timely manner.
- Created CA7 DOC5, DOC40 documents for ALLL Job scheduling.
- Creating and verifying Informatica Power Center data mappings/sessions/workflows for ALLL.
- Created a shell script to NDM source files from the Windows server to the Linux server for ETL processing.
- Participating in test-driven development, peer reviews and development team meetings.
- Performing unit testing on newly created or modified ALLL components.
- Peer review on the ETL components, test results and code deliverables.
- Extensive use of OFSAA Revelus DEFQ reports to generate allowance and losses.
- Defect prevention and causal analysis activities.
- Resolved the issue for the student lending portfolio loans with respect to a delinquency mismatch in the production environment.
- Weekly status meeting with project manager.
Informatica Development
Confidential, Minneapolis, MN
Environment: Oracle 9i, Informatica PowerCenter 9.1/8.6/8.5, Informatica PowerExchange, Harvest, UNIX, PL/SQL, SQL Server 2008, PuTTY/WinSCP, Microsoft SQL Server Management Studio, Toad, Teradata Studio, Shell Scripts, ServiceNow, Mainframes, Control-M, Windows XP, OBIEE 9i
Responsibilities:
- Understanding existing business model and Business requirements for archival of the data.
- Understanding the functionality of the existing Stage and Reporting Layer systems.
- Extraction, Transformation and Loading (ETL) of data by using Informatica Power Center 9.1.
- Participation in knowledge sharing sessions with the application teams to understand the flow and frequency of the data.
- Identified risks, issues and bottlenecks during the analysis phase.
- Generated reusable transformations and mappings for archiving tables with the same structure in the stage layer.
- Designing the integration of the ETL Stage and Reporting layers with concept of monthly and Daily archival process.
- Designing the ETL low level design document and unit test cases.
- Created a procedure to archive data based on daily and monthly data frequencies.
- Created Oracle scripts, used in ETL sessions, to call the archival procedure.
- Conducting unit testing.
- Used various Informatica transformations like Expression, Filter, Aggregator, Rank and Router to load cleaner, more consistent data into the targets.
- Created and Monitored sessions using Informatica Workflow Manager and Workflow Monitor.
Confidential
Computer Programmer (ETL/Informatica Developer)
Responsibilities:
- Understanding the functionality of the PeopleSoft HCM engine which generates consumer addresses.
- Gathering Business requirements for design layout of FEDS data process.
- Created ETL to extract and identify employees and contractors through the HRMS system.
- Created Control-M jobs to trigger ETL workflows.
- Identified risks, issues and bottlenecks during the analysis phase.
- Used Designer to create source definitions, designed targets, created mappings and developed transformations.
- Conducting code standard check and unit testing.
- Developed several reusable transformations and mapplets that were used in other mappings.
- Created the ETL process to verify the Address issued for each account/loan.
- Created event trigger to report any fraud.
- Provided the post implementation support.
- Resolved an issue where the fraud analysis engine misinterpreted accounts.
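The Control-M jobs mentioned above trigger Informatica workflows, which is conventionally done through a small pmcmd wrapper. A minimal sketch, assuming placeholder service, domain, folder and workflow names, with credentials passed via environment variables and the command printed as a dry run:

```shell
#!/bin/sh
# Hypothetical sketch of the wrapper a Control-M job could call to start
# an Informatica workflow with pmcmd. All names below are placeholders.

INFA_SERVICE="IS_FEDS"          # assumed Integration Service name
INFA_DOMAIN="Domain_FEDS"       # assumed Informatica domain name
FOLDER="FEDS"                   # assumed repository folder
WORKFLOW="wf_feds_address_load" # assumed workflow name

# -uv/-pv read the user and password from environment variables so no
# credentials appear in the scheduler's job definition; -wait blocks
# until the workflow finishes so Control-M sees the real exit status.
start_workflow_cmd() {
    printf 'pmcmd startworkflow -sv %s -d %s -uv INFA_USER -pv INFA_PASSWD -f %s -wait %s\n' \
        "$INFA_SERVICE" "$INFA_DOMAIN" "$FOLDER" "$WORKFLOW"
}

# Print the command the real job would run.
start_workflow_cmd
```

Running pmcmd with -wait lets the scheduler chain downstream jobs on success or raise an alert on failure.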
Confidential
Computer Programmer (ETL/Informatica Developer)
Environment: Oracle 9i, Informatica PowerCenter 8.5, Informatica PowerExchange, Harvest, UNIX, PL/SQL, SQL Server 2008, PuTTY/WinSCP, Microsoft SQL Server Management Studio, Toad, Shell Scripts, ServiceNow, Mainframes, Control-M, Windows XP, OBIEE 9i
Responsibilities:
- Understanding the functionality of the existing SRT ETL, which generates data from different portfolios.
- Reduced the ETL load process from 3 hours to 1.5 hours by tuning and splitting the mappings based on identically structured SRT datasets.
- Used Designer to create source definitions, designed targets, created mappings and developed transformations.
- Developed several reusable transformations and mapplets that were used in other SRT related mappings.
- Created ETL to extract and identify data marked with the Maker-Checker condition.
- Developed the ETL to verify the new set of security keys in Stage and communicate with the business through email for re-validation before the main process starts.
- Provided post-implementation support.
- Conducting code standard check and unit testing.
- Created Control-M jobs to trigger SRT ETL workflows.