Informatica Developer Resume Profile
West Chester, PA
Professional Summary:
- 6 years of experience in Data Warehousing/Integration using the ETL tool Informatica Power Center 9.x/8.x/7.x across business areas such as the Insurance, Healthcare, Banking, and Financial Services industries.
- Involved in full life cycle development, including system analysis, design, data modeling, implementation, and support, of various Data Warehousing and OLAP applications.
- Extensive experience in using Informatica Power Center 9.x/8.x/7.x to carry out the Extraction, Transformation, and Loading process, as well as administration tasks such as creating domains, repositories, and folders.
- Have clear understanding of Data Warehousing and BI concepts with emphasis on ETL and life cycle development using Power Center, Repository Manager, Designer, Workflow Manager and Workflow Monitor.
- Data Modeler with strong conceptual and logical data modeling skills and experience in requirements gathering, source-to-target mapping, and writing functional specifications and queries.
- Extensively worked on Dimensional Modeling, Data Migration, Data Cleansing, and Data staging for operational sources using ETL and data mining features for data warehouses.
- Knowledge of complete software development life cycle including Requirement Gathering, Requirement Analysis, Cost Estimation, Project Management, Design, Development, Implementation and Testing.
- Expertise in OLAP/OLTP system study, analysis, and ER modeling, developing database schemas such as Star Schema and Snowflake Schema, and Conformed Dimensions and Slowly Changing Dimensions used in relational, dimensional, and multidimensional modeling.
- Strong experience in designing and developing complex mappings using varied transformation logic such as Unconnected and Connected Lookups, Static and Dynamic Lookups, SQL, Stored Procedure, Router, Filter, Expression, Aggregator, Joiner, and Update Strategy.
- Good understanding of relational database management systems such as Oracle, Teradata, DB2, and SQL Server, and worked on Data Integration using Informatica for the extraction, transformation, and loading of data from various source database systems.
- Actively involved in Performance Tuning, with working knowledge of Java/J2EE and claims processing.
- Experience in implementing update strategies and incremental loads.
- Used Informatica mapping variables/parameters and session variables; strong in UNIX Shell scripting and Perl scripting. Wrote scripts using the pmcmd utility and scheduled ETL loads using utilities such as cron, Control-M, and Autosys (a sketch follows this list).
- Expertise in building Enterprise Data Warehouses (EDW), Operational Data Stores (ODS), Data Marts, and Decision Support Systems (DSS) using multidimensional and dimensional modeling (Star and Snowflake schema) concepts.
- Good knowledge of Oracle 11g/10g/9i, IBM DB2 9.x/8.x, Teradata, Sybase, SQL Server 2012/2008/2005, SQL, and PL/SQL (stored procedures, functions, and exception handling) using Toad, as well as Mainframe technologies: COBOL, CICS, JCL, VSAM, File-AID, File Manager, and Expeditor.
- Responsible for interacting with business partners to identify information needs and business requirements for reports.
- Extensive database experience and highly skilled in SQL Server, Oracle, DB2, Sybase, XML, flat files, and MS Access.
- Excellent communication and interpersonal skills; an enthusiastic, knowledge-hungry self-starter, eager to meet challenges and quickly assimilate the latest technologies, concepts, and ideas.
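Below is a minimal sketch of the pmcmd wrapper and cron scheduling pattern referenced above; the service, domain, folder, workflow, and path names are hypothetical placeholders, not taken from any specific project.

    #!/bin/ksh
    # run_wf_daily_load.ksh -- hypothetical wrapper: starts an Informatica
    # workflow via pmcmd and passes its exit status back to the scheduler.
    # INFA_PASSWD is assumed to be exported by the user's profile.

    pmcmd startworkflow \
        -sv IS_DEV -d DOM_DEV \
        -u etl_user -pv INFA_PASSWD \
        -f FOLDER_SALES_DW -wait wf_daily_load
    exit $?

    # Example crontab entry (assumed paths): run the wrapper at 2:00 AM daily.
    # 0 2 * * * /opt/etl/scripts/run_wf_daily_load.ksh >> /opt/etl/logs/wf_daily_load.log 2>&1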
Technical Skills:
- ETL Tools: Informatica Power Center 9.x/8.x/7.x, PowerMart, Power Exchange, Crystal Reports
- Languages: C/C++, Java, HTML, DHTML, XML, VB, SQL, PL/SQL.
- RDBMS: Oracle 11g/10g/9i/8i/7.x, DB2 V9.x/8.x, SQL Server 2012/2008/2005, MS Access.
- Database-related utilities/tools: Mainframe DB2, VSAM and Sequential Files, File-AID, TOAD, Teradata SQL Assistant, SQL*Loader, SQL*Plus, SQL, PL/SQL (stored procedures, functions, packages, triggers, database links, exception handling), Import/Export utilities, Oracle Warehouse Builder.
- Scripts: UNIX Shell Scripting, JavaScript.
- Operating Systems: Windows 7/Vista/XP, UNIX, and Red Hat Linux.
Confidential
Informatica Developer
Confidential is the largest broadcasting and Confidential in the world by revenue. Comcast has been moving toward including Contract data in the DW for use by enterprise users who need to report on it, analyze it, create models and trends, and submit it to operational parties. This project brings Contract data to business-critical Comcast departments that need it to create Finance exhibits, identify trends, report for bids and business uses, and analyze it throughout the enterprise through various avenues. There are currently several scattered processes for bringing Contract data into various departments, which can cause confusion and reconciliation issues. Teradata and Informatica ETL are used to load the contract details into the DW. Data is accessed by end-user groups using OBIEE reports and dashboards.
Responsibilities:
- Worked with architects and business managers to understand the requirements and source systems in order to prepare design documents specifying the various ETL approaches, the pros and cons of each, and a recommendation for the best approach.
- Worked extensively in Informatica Designer to design a robust end-to-end ETL process involving complex transformations such as Source Qualifier, Lookup, Update Strategy, Router, Aggregator, Sequence Generator, Filter, Expression, Stored Procedure, External Procedure, and Transaction Control for the efficient extraction, transformation, and loading of data to staging and then to the Data Mart/Data Warehouse, verifying the complex logic for computing the facts.
- Extensively used reusable transformations, mappings, and Mapplets for faster development and standardization.
- Used reusable Sessions across different workflows.
- Involved in Performance Tuning in Informatica at the source, transformation, target, mapping, and session levels.
- Created variables and parameter files for mappings and sessions so that they can be migrated easily across environments and databases.
- Designed various tasks using Informatica Workflow Manager, such as Session, Command, Email, Event-Raise, and Event-Wait.
- Used FTP and NDM protocols to exchange flat files with the Mainframe server.
- Designed the production-support architecture to schedule tasks without any manual intervention, using UNIX shell scripting.
- Used UNIX commands and shell scripts to interact with the server, move flat files, and load the files onto the server.
- Performed file archiving using UNIX shell scripting (see the archiving sketch after this list).
- Involved in resolving issues related to migrating designs from the Dev server to the Prod server, and helped resolve production-support issues for the Data Warehouse.
- Wrote complex SQL queries and PL/SQL functions, procedures, packages, cursors, and triggers to retrieve data from source systems and to count and validate data sets.
- Actively involved in Data Migration from source to target Database.
- Performed unit testing by writing simple test scripts in the database, and was involved in integration testing, comparing source to target with MINUS SQL scripts (see the validation sketch after this list).
- Interacted with user groups, corporate testing groups, and business people.
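A minimal sketch of the shell-based file archiving mentioned above; the directories and the 30-day retention period are assumptions for illustration.

    #!/bin/ksh
    # archive_flat_files.ksh -- hypothetical archiving step: compress processed
    # flat files into a dated archive directory, then purge archives older
    # than 30 days.

    SRC_DIR=/opt/etl/inbound/processed        # assumed landing area
    ARC_DIR=/opt/etl/archive/$(date +%Y%m%d)  # dated archive directory

    mkdir -p "$ARC_DIR"
    for f in "$SRC_DIR"/*.dat; do
        [ -f "$f" ] || continue               # skip when no files match
        gzip -c "$f" > "$ARC_DIR/$(basename "$f").gz" && rm -f "$f"
    done

    find /opt/etl/archive -type f -name '*.gz' -mtime +30 -exec rm -f {} \;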
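And a sketch of the source-to-target MINUS comparison used for validation; the table names and connect string are hypothetical.

    #!/bin/ksh
    # validate_contract_load.ksh -- hypothetical check: any rows returned by
    # the MINUS query exist in staging but are missing from the target fact.
    # ORA_PWD is assumed to be exported by the environment profile.

    sqlplus -s etl_user/"$ORA_PWD"@DWPROD <<EOF
    SET PAGESIZE 0 FEEDBACK OFF HEADING OFF
    SELECT contract_id, contract_amt FROM stg_contract
    MINUS
    SELECT contract_id, contract_amt FROM dw_contract_fact;
    EOF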
Environment: Informatica Power Center 9.5, Oracle 10g/11g RAC, SQL, PL/SQL, Toad 11, SQL Developer, TOAD for Data Modelers, Mainframe COBOL, F-Secure Shell, UNIX Shell Scripting, HP ALM, Harvest
Confidential
Informatica Developer
Confidential. This project, Global Payment Services (GPS), is a detailed client and product profitability reporting and decision-support tool for Global Treasury Services. GPS is split into CMV (Client Market View) and CFV (Client Financial View). CMV focuses on product revenue and expense in the period the service was performed. CFV reports the revenue as it is booked by the Bank in the period it is booked. Client and product attributes are referred to as dimensional data. Dimensional files are sent to GPS-MIS Operations, which extracts, transforms, and merges the data into one file for input to the GPS-MIS load environment. Balance and activity metrics are referred to as fact data. Fact interface files are extracted from the sourcing application and sent directly to the GPS-MIS load environment.
Responsibilities:
- Involved in requirements gathering from the business to understand the data inputs.
- Worked closely to develop logical and physical data models that capture current-state/future-state data elements and data flows using ERwin.
- Analyzed the data model to fit the ETL that loads its input tables, with proper surrogate-key links. Actively involved in the design and development of the STAR schema data model.
- Derived the dimensions and facts from the given data and loaded them at regular intervals per the business requirements. Developed entity diagrams and data dictionaries to accomplish these tasks.
- Designed and developed ETL routines using Informatica Designer; within the Informatica mappings, extensively used Lookups, Aggregator, Rank, Normalizer, Mapplets, connected and unconnected stored procedures/functions, SQL overrides in Lookups, source filters in Source Qualifiers, and Routers to manage data flow into multiple targets.
- Extensively used Global Sources/Targets, User-Defined Functions, and reusable mappings and transformations. Used the Incremental Aggregation technique to load data into aggregation tables for improved performance.
- Extensively used parameter files to override mapping parameters, mapping variables, workflow variables, session parameters, FTP session parameters, and source/target application connection parameters.
- Used Pushdown Optimization to push transformation logic onto the database, on both the source and target sides wherever possible, to improve mapping performance.
- Performed risk and gap analysis. Used Informatica best practices for error handling (logging record-level errors in metadata tables) and auditing (capturing source/target record counts in every phase of the process flow).
- Extracted data from source systems such as flat files as per the requirements and loaded it into Teradata using FASTLOAD, TPUMP, and MLOAD.
- Worked with Teradata SQL Assistant for analysis and wrote BTEQ scripts to load the data from Teradata staging to Teradata warehouse tables (a BTEQ sketch follows this list).
- Created and Configured Workflows, Worklets and Sessions to transport the data to target warehouse Teradata tables using Informatica Workflow Manager.
- Wrote pre-session and post-session SQL commands, executed in the target database, to drop and re-create the target table's indexes before and after loading data into it.
- Created dynamic deployment groups and ETL queries for promoting code to higher environments.
- Involved in identifying bottlenecks in sources, targets, mappings, and sessions, and resolved them using performance-tuning techniques such as increasing block size, data cache size, and buffer length.
- Extensively used UNIX commands within Informatica for pre-session and post-session data-loading steps. Used UNIX shell scripts to poll for a trigger file in a loop before scheduling the workflow (see the polling sketch after this list).
- Developed UNIX shell scripts to send out an e-mail on success of the process, indicating the destination folder where the files are available. Used Autosys for scheduling the jobs.
- Involved in unit, system, and user acceptance testing to verify that the data is loaded into the target.
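A minimal sketch of the BTEQ load step described above; the logon, database, and table names are placeholders, and TD_PWD is assumed to be exported by the environment profile.

    #!/bin/ksh
    # load_gps_fact.ksh -- hypothetical BTEQ step: insert from the Teradata
    # staging table into the warehouse table; a SQL error fails the job.

    bteq <<EOF
    .LOGON tdprod/etl_user,${TD_PWD}

    INSERT INTO dw.gps_fact (client_id, product_id, rev_amt, period_dt)
    SELECT client_id, product_id, rev_amt, period_dt
    FROM   stg.gps_fact_stg;

    .IF ERRORCODE <> 0 THEN .QUIT 8
    .LOGOFF
    .QUIT 0
    EOF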
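And a sketch of the trigger-file polling and success e-mail pattern; the paths, mail address, and workflow name are assumptions.

    #!/bin/ksh
    # wait_and_run_cmv.ksh -- hypothetical control script: loop until the
    # upstream trigger file lands, run the workflow, then notify on success.

    TRIGGER=/opt/etl/inbound/cmv_ready.trg    # assumed trigger file
    DEST=/opt/etl/outbound                    # assumed destination folder

    i=0
    while [ ! -f "$TRIGGER" ]; do             # poll every 5 min, max 2 hours
        i=$((i + 1))
        [ "$i" -gt 24 ] && { echo "trigger never arrived" >&2; exit 1; }
        sleep 300
    done

    pmcmd startworkflow -sv IS_PROD -d DOM_PROD -uv INFA_USER -pv INFA_PWD \
        -f FOLDER_GPS -wait wf_cmv_load || exit $?

    echo "CMV load complete; files are in $DEST" | \
        mailx -s "GPS CMV load succeeded" etl-support@example.com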
Environment: Informatica 9.x, Oracle SQL, PL/SQL, Toad, DB2, UNIX Shell scripting, Autosys, Teradata utilities (BTEQ, MLOAD, FASTEXPORT, FASTLOAD)
Confidential
Informatica Developer
Confidential is a group of insurance and financial services companies in the Confidential. The main objective of the project is to develop and test a data warehouse called DAL (Data Access Layer) for claims-related data. The data from the ECS OLTP application is first moved to the CCD staging area using an EL-T tool and then loaded into data marts from different sources using Informatica Power Center 8.6 as the ETL tool; IBM DB2 is both the source and the target database.
Responsibilities:
- Responsible for analyzing functional requirements. Created design specifications and technical specifications based on the functional requirements.
- Extensively worked on developing Informatica Mapplets, Mappings, Sessions, Worklets and Workflows for data loading.
- Worked on transformations such as Lookup, Aggregator, Expression, Joiner, Filter, Rank, Sorter, Router, Sequence Generator, XML transformation, etc.
- Created initial-load mappings for all source systems, cleansed the data, and loaded it into the staging area.
- Worked on the Web Service Consumer transformation to gather meteorological data.
- Extensively used ETL to load data from a wide range of sources such as relational databases, XML, and flat files (fixed-width or delimited).
- Expertise in writing pre/post SQL and SQL overrides as per the requirements.
- Extensively worked with PDO (Pushdown Optimization) and the CDC (Change Data Capture) mechanism.
- Extensively used capabilities of Power Center such as File Lists, pmcmd, Target Load Order, Constraint-Based Loading, and Concurrent Lookup Caches.
- Responsibilities included designing and developing complex Informatica mappings including Type-II slowly changing dimensions.
- Identified, researched, and resolved the root causes of ETL production issues and system problems.
- Worked with pre-session and post-session UNIX scripts for automation of ETL jobs using the Control-M scheduler, and was involved in migration/deployment of ETL code from the development to the production environment.
- Responsible for creating the parameter files needed to connect to the right environment and database (a sketch follows this list).
- Responsible for monitoring sessions. Debugged the mappings of failed sessions.
- Handled production support, monitoring the daily and monthly jobs.
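A sketch of the environment-specific parameter file creation mentioned above; the folder, workflow, session, connection, and path names are hypothetical.

    #!/bin/ksh
    # gen_param_file.ksh -- hypothetical generator: writes an Informatica
    # parameter file pointing the session at the right environment/database.
    # Usage: gen_param_file.ksh DEV|QA|PROD

    ENV=${1:?usage: gen_param_file.ksh DEV|QA|PROD}

    case "$ENV" in
        DEV)  DB_CONN=DB2_DAL_DEV ;;
        QA)   DB_CONN=DB2_DAL_QA ;;
        PROD) DB_CONN=DB2_DAL_PROD ;;
        *)    echo "unknown environment: $ENV" >&2; exit 1 ;;
    esac

    cat > /opt/etl/params/wf_dal_claims_load.parm <<EOF
    [FOLDER_DAL.WF:wf_dal_claims_load.ST:s_m_claims_load]
    \$DBConnection_Tgt=$DB_CONN
    \$\$LOAD_DT=$(date +%Y-%m-%d)
    EOF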
Environment: Informatica Power Center 8.6, IBM DB2, Control-M, PL/SQL, FileZilla, Windows, UNIX Shell Scripting, etc.
Confidential
Informatica Developer
Confidential is a leading pharmaceutical company in Indianapolis. The project involved Data Warehouse development for the sales division, covering the sales data for the company's pharmaceutical and medical products. This Data Warehouse enables management to better leverage information collected within current operational systems to support its decision-making process.
Responsibilities:
- Gathered business requirements from business users and prepared functional specifications for various interfaces and Data Migrations.
- Developed ETL jobs to load data into XML files from Oracle database.
- Developed various mappings to migrate the data from Darwin to Smartlab.
- Developed various interfaces, such as the Darwin-to-Smartlab bi-directional interface and the MDI interface.
- Used Integration Manager to load data from XML files into Darwin system.
- Extensively worked on developing and modifying ETL Mappings using various transformations like Source Qualifier, Expression, Lookup, Update Strategy, Rank, Aggregator, Stored Procedure, Sorter, and Router in Informatica Designer.
- Extensively worked on creating Workflows for various Interfaces and Data Migration mappings.
- Scheduled and unscheduled workflows, and used UNIX Command tasks to automate the entire process of fetching the source file from a different path and FTPing it onto the server (a sketch follows this list).
- Involved in creating target tables in Oracle, building Informatica mappings and workflows, and executing them through UNIX shell scripts.
- Involved in Testing of Interfaces and Data Migrations.
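A minimal sketch of the fetch-and-FTP automation described above; the hosts, paths, and credentials are placeholders (a real setup would keep credentials out of the script, e.g., in .netrc).

    #!/bin/ksh
    # fetch_darwin_file.ksh -- hypothetical UNIX Command task body: copy the
    # source file from its drop location and ftp it to the Informatica server.

    SRC=/mnt/darwin/export/sales_extract.xml   # assumed source path
    STAGE=/opt/etl/srcfiles                    # assumed staging directory

    cp "$SRC" "$STAGE/" || exit 1

    ftp -n infa-server <<EOF
    user etl_user etl_pass
    binary
    cd /infa/srcfiles
    put $STAGE/sales_extract.xml sales_extract.xml
    bye
    EOF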
Environment: Informatica Power Center 8.1, Integration Manager, Oracle 10g, SQL, PL/SQL, UNIX Shell Scripts (Korn), Windows XP, MQ Series, and Flat Files.