Senior Informatica/ETL Developer with 7.6 years of IT experience in business requirements analysis, application design, development, and implementation of data warehousing applications using ETL tools, including Informatica Power Center 9.1/8.6.1/8.6.0/8.5/8.1.1/7.1.4/6.5.
- Worked on heterogeneous databases such as Oracle 10g/9i/8i/8.x, Teradata V2R5/V2R4, DB2, MS SQL Server 2008/7.0/6.5, and MS Access 7.0/2000, using SQL, XML, PL/SQL, SQL*Plus, SQL*Loader, Developer 2000, Windows 3.x/95/98/2000, and TOAD.
- Extensive integration experience working with different data sources such as Oracle, MS SQL Server, SAS, and flat files.
- Extensively worked on Informatica Designer components (Source Analyzer, Warehouse Designer, Transformation Developer, Mapplet Designer, Mapping Designer), Workflow Manager, and Workflow Monitor, and implemented SCD Type 2 using Informatica Power Exchange.
- Strong experience in data analysis, user requirement gathering, data cleansing, data transformation, data relationships, source system analysis, reporting analysis, and creating mapping documents.
- Experience in planning, building, and managing successful large-scale Data Warehouse and decision support systems.
- Comfortable in both technical and functional applications of RDBMS, Data Mapping, Data management, Data transportation and Data Staging.
- Expertise in building Enterprise Data Warehouses (EDW), Operational Data Store (ODS), Data Marts and Decision Support Systems (DSS) using Multidimensional and Dimensional modeling (Star and Snowflake schema) Concepts.
- Extensive working experience in database design and development, creating complex database queries and writing constraints, indexes, views, stored procedures, and functions in SQL Server 2000/2005/2008/2012 and MySQL.
- Experience in using UNIX commands and writing Shell Scripts.
- Worked on integrating web services with Informatica and PL/SQL.
- Used SQL and PL/SQL to write stored procedures, functions, packages, and triggers.
- Experience in Agile Scrum methodology: planning, scheduling, and defining user stories and tasks for the team.
TECHNICAL SKILLS:
- ETL Tools: Informatica 6.x/7.x/8.x/9.1/9.5, Data Stage, Power Exchange
- Databases: Teradata 12/13.10, Oracle 11g/10g, SQL Server 2000/7.0/6.5, MS Access, IBM DB2 UDB 8.1/7.0, Sybase 12.x/11.x
- BI/Reporting Tools: Business Objects 5.1/6.5/XI Release 2 (Web-Intelligence 2.5, Designer 5.0), Cognos 7, Microsoft SQL Server Analysis, SAS, OBIEE, Micro Strategy
- Data Modeling: Dimensional data modeling, star join schema modeling, snowflake modeling, fact and dimension tables, physical and logical data modeling, Erwin 3.5.2/3.x, Microsoft Visio
- Languages/Scripting: SQL, PL/SQL, T-SQL, UNIX shell scripting, UML, XML
- Tools/Utilities: SQL*Plus, SQL*Loader, TOAD, Eclipse, Putty, F-Secure SSH Client 5.3, FTP, Autosys
- Operating Systems: UNIX, Sun Solaris 5.8/5.6, AIX 5.3/4.3, HP-UX, DOS, Linux, Windows 98/NT/2000/XP/Vista/7
- Other: SOAP
SUMMARY OF EXPERIENCE:
Senior Informatica/ETL Developer
Confidential, Santa Clara, CA
Confidential is a full-service semiconductor foundry with a truly global manufacturing and technology footprint built through collaboration and innovation. The cdw.2.0 project was designed to develop and maintain data marts: data is extracted from various applications and loaded into different systems using ETL tools. Worked on Informatica Power Center and developed individual workflows.
Responsibilities:
- Involved in design specifications and technical sessions; understood the business point of view to implement coding using the Informatica Power Center Designer.
- Designed and developed ETL processes based on business rules using Informatica Power Center.
- Worked extensively on complex mappings using Source Qualifier, Joiner, Expression, Aggregator, Filter, Lookup, and Update Strategy transformations.
- Integrated all jobs using complex Mappings including Mapplets and Workflows using Informatica power center designer and workflow manager.
- Involved in design, development, deployment, and production support.
- Developed mappings to load into staging tables and then to Dimensions and Facts.
- Used Type 1 and Type 2 SCD mappings to update Slowly Changing Dimension tables.
- Performed performance tuning at both the mapping and session level to increase throughput, and optimized SQL queries.
- Proficient in ETL (Extract, Transform, Load) using SQL Server Integration Services 2005 (SSIS) and the Informatica Power Center tool.
- Performed analysis/design/development/unit testing for Power Center 8.6/9.1.
- Performed unit testing, point-to-point testing and integration testing to confirm ETL code integrity.
- Enhanced performance of Informatica sessions using large data files by using partitions and increasing block size, data cache size, sequence buffer length, and the target-based commit interval.
- Sourced large volumes of data from multiple data sources (relational tables and flat files) into the staging area using Informatica Power Center.
- Proactively work with other Data Warehouse team members and with the members of other teams (business customers, analysts, developers, DBA’s and technical support staff) to create innovative analytical solutions.
- Developed standard and reusable mappings and mapplets using various transformations such as Expression, Aggregator, Joiner, Source Qualifier, Router, and Lookup.
- Designed the ETL process using Informatica to load data from sources to targets through data transformations.
- Involved in ETL process from development to testing and production environments. Used Debugger for debugging Mappings.
- Gained working knowledge of the ControlM scheduling tool for loading/force-starting jobs, changing job status, and monitoring job progress. Worked with the team to design the ControlM architecture: which agent would be installed on which server, how many agents were required, etc.
- Designed and implemented the error-handling strategy for the ETL team.
- Monitored data warehouse month-end loads to ensure successful completion.
Environment: Informatica Power Center 8.6/9.1, ControlM Scheduler 7.0, ControlM Desktop, ControlM Enterprise Manager, Toad, PL/SQL (stored procedures, triggers, packages), Oracle 9i/10g, SQL Server 2005, OBIEE, Ab Initio, Erwin.
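The Type 2 slowly changing dimension work described above follows a standard pattern. A minimal illustrative sketch in Python (not the project's Informatica mapping; column names such as `eff_date`, `end_date`, and `is_current` are hypothetical):

```python
from datetime import date

def apply_scd2(dimension, incoming, key, tracked, today=None):
    """Type 2 SCD sketch: when a tracked attribute changes, expire the
    current dimension row and insert a new version; new keys are inserted
    as version 1. `dimension` and `incoming` are lists of dicts."""
    today = today or date.today()
    current = {r[key]: r for r in dimension if r["is_current"]}
    for row in incoming:
        match = current.get(row[key])
        if match is None:
            # brand-new key: insert as the first, current version
            dimension.append({**row, "version": 1, "is_current": True,
                              "eff_date": today, "end_date": None})
        elif any(match[c] != row[c] for c in tracked):
            # tracked attribute changed: close the old row, open a new one
            match["is_current"] = False
            match["end_date"] = today
            dimension.append({**row, "version": match["version"] + 1,
                              "is_current": True, "eff_date": today,
                              "end_date": None})
    return dimension
```

In a PowerCenter mapping the same decision is typically made with a Lookup plus an Update Strategy transformation; the sketch only shows the row-versioning logic.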
Senior Informatica/ETL Developer
Confidential, Greensboro, NC
Confidential is a global apparel and footwear company with more than 30 brands. The data warehouse aggregates fashion trends and sales information from a variety of sources around the globe and makes it accessible in real time. Analysis is done on merchandising, buying, and trading. Daily and weekly reports are generated to help track the competition and refine product planning.
Responsibilities:
- Worked with power center tools like Designer, Workflow Manager, Workflow Monitor, and Repository Manager.
- Worked on Designer tools like Source Analyzer, Warehouse Designer, Transformation Developer, Mapplet Designer and Mapping Designer.
- Worked with Informatica Metadata Manager to analyze production data issues.
- Worked with the Informatica Data Quality (IDQ) tool to build profiles and ensure data quality for consumers.
- The second process of stage 1 involved generating statistical reports in SAS that could be read by the application software used by the analytics team.
- Collected and integrated different data feeds used to compile sales data.
- Involved in Data Profiling and Data cleansing process.
- Responsible for determining the Mapping bottlenecks with Informatica and fixing the issues by tuning the Mappings for better performance.
- Coordinate production change requests and production releases.
- Troubleshooting load failure issues and data quality issues on a day to day basis.
- Maintained the daily ETL schedule, recovered daily failures, and generated daily reports for users.
- Maintained and enhanced the existing data warehouse, exports, and reports using SQL.
- Used Informatica Debugger to troubleshoot data and error conditions.
- Maintain documents for all the Development work done and error fixings performed.
- Involved in deploying and redesigning several ETL processes for the existing research line of business.
- Prepared reports on the performance of UNIX operating systems and the applications run on them.
- Wrote code in languages such as C, Perl, and shell.
- Involved in training users on various UNIX-based applications.
- Responsible for Best Practices like naming conventions, and Performance Tuning.
- Developed Reusable Transformations and Reusable Mapplets.
- Worked extensively with session parameters, mapping parameters, mapping variables, and parameter files for incremental loading.
- Used tools like TOAD and SQL Navigator to run queries and validate the data loaded into the marts and other layers such as the data warehouse and persistent staging.
- Involved in knowledge transfer from a different vendor two weeks before go-live and handled various performance issues effectively.
- Ran UNIX scripts to start the Java web service that generates load-confirmation emails for users.
- Coordinated with the offshore team on a daily basis to handle production support issues and get daily updates on development work.
Environment: Informatica Power Center (Repository Manager, Designer, Workflow Manager, Workflow Monitor, Source Analyzer, Warehouse Designer, Transformation Developer, Mapplet Designer, Mapping Designer, Workflow Designer, Task Developer, Worklet Designer), SAS, IDQ, Power Exchange (SAP), UNIX, Oracle 9i/11g, SQL, PL/SQL, TOAD, Informatica Metadata Manager, Informatica Data Quality, Erwin, Teradata.
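The parameter-file-driven incremental loading mentioned above can be sketched as follows. This is an illustrative Python approximation, not Informatica's actual parameter-file parser; the watermark name `$$LAST_EXTRACT_DATE` and column `last_updated` are hypothetical examples:

```python
def parse_param_file(text):
    """Parse a minimal Informatica-style parameter file fragment
    (NAME=VALUE lines, section headers in brackets) into a dict."""
    params = {}
    for line in text.splitlines():
        line = line.strip()
        if "=" in line and not line.startswith("["):
            name, value = line.split("=", 1)
            params[name.strip()] = value.strip()
    return params

def incremental_filter(rows, params, ts_col="last_updated"):
    """Keep only rows modified after the last-extract watermark,
    mirroring a source-qualifier filter on a mapping variable."""
    watermark = params["$$LAST_EXTRACT_DATE"]
    # ISO-8601 date strings compare correctly as plain strings
    return [r for r in rows if r[ts_col] > watermark]
```

In the real sessions, the watermark would be advanced (e.g. via SetMaxVariable) after each successful run so the next run picks up only new changes.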
Confidential, Towson, MD
Confidential is a leading provider of healthcare data management, analytics, decision support, process automation and related information technology solutions. The Early Retiree Reinsurance Program (ERRP) provides reimbursement to participating employment-based plans for a portion of the costs of health benefits for early retirees and early retirees’ spouses, surviving spouses, and dependents. The program was authorized in the Affordable Care Act.
Responsibilities:
- Obtained data about customers from different systems and aggregated it within the data warehouse using Informatica.
- Worked with data modelers to prepare logical and physical data models and add/delete necessary fields using Erwin.
- Implemented slowly changing dimensions to maintain current and historical information in dimension tables.
- Worked on different data sources such as Teradata, Oracle, DB2, Flat files, etc.
- Used UNIX Korn shell scripts to load data from flat files into Teradata development and production environments. Created UNIX scripts to pre-process the flat files before the MLoad/FastLoad process.
- Created and scheduled jobs on Windows to extract data from Oracle, DB2, SQL Server and Excel using the utility OLELoad to load into Teradata.
- Wrote DB2 stored procedures for implementing business rules and transformations.
- Involved in the preparation of Technical design documents, Source to target (S2T) document, Review checklist and Program Specifications or Technical Specifications.
- Developed and tested stored procedures, functions, and packages in PL/SQL for data ETL.
- Prepared test cases and involved in unit testing of mappings, system testing and user acceptance testing.
- Wrote documentation describing program development, logic, coding, testing, changes, and corrections.
Environment: Informatica Power Center 9.x/8.6.1, OBIEE, UDB DB2 8.1, Oracle 10g/9i, SQL, PL/SQL, XML, Cognos Series 8.3/7.0, MS Access, Windows 2003, UNIX, Business Objects 6.5, Solaris 10.
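The pre-processing pass run before MLoad/FastLoad in this role was a shell script; its typical shape can be sketched in Python. A hedged illustration only, with an assumed pipe delimiter and field count (the real files' layout is not stated above):

```python
def preprocess_flat_file(lines, delimiter="|", expected_fields=3, skip_header=True):
    """Mimic a pre-load cleanup pass: drop the header row, trim whitespace
    around each field, and divert records with the wrong field count to a
    reject list (as a loader utility's error table would)."""
    good, rejects = [], []
    for i, line in enumerate(lines):
        if skip_header and i == 0:
            continue  # loader utilities usually expect data rows only
        fields = [f.strip() for f in line.rstrip("\n").split(delimiter)]
        cleaned = delimiter.join(fields)
        (good if len(fields) == expected_fields else rejects).append(cleaned)
    return good, rejects
```

The cleaned records would then be handed to the Teradata load utility; malformed rows are kept aside for investigation rather than failing the load.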
Confidential is a fully integrated bio and pharmaceutical services provider offering clinical, commercial, consulting, and capital solutions. It navigates risk and seizes opportunities in an environment where change is constant.
Responsibilities:
- Work experience with Informatica Power center 8.5.1/8.1.1.
- Created connected, unconnected, and dynamic lookups for better performance.
- Involved in fine-tuning SQL queries in the transformations.
- Created reusable mapplets and transformations in Informatica.
- Developed and Implemented Informatica parameter files to filter the daily data from the source system.
- Developed workflow tasks such as reusable Email, Event Wait, Timer, Command, and Decision tasks.
- Good work experience in various types of testing, including integration testing, performance testing, and parallel testing.
- Experience integrating various data sources such as Oracle, DB2, SQL Server, and MS Access into the staging area.
- Experience developing mappings according to business rules, migrating them to QA and production, and following naming conventions and mapping design standards, with sound knowledge of data warehouse and PL/SQL concepts, ODBC connections, etc.
- Used Ralph Kimball Methodology for building the data warehouse.
- Good Experience in Repository Administration, Backups and creation of User Groups.
- Creating Test case documents for Unit Test, Integration Test to check the data and Mapping documents.
- Created various transformations according to business logic, including Joiner, Expression, Aggregator, Rank, Lookup, Update Strategy, Filter, and Router transformations.
- Created Schema objects like Indexes, Views, and Sequences.
- Created Tasks, Workflows and Worklets using Workflow Manager. Monitoring of Workflows and Worklets using Workflow Monitor.
- Tuned performance of Informatica sessions for large data files by increasing block size, data cache size, and sequence buffer length.
Environment: Informatica Power Center 8.5/8.1 (Workflow Manager, Workflow Monitor, Worklets, Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer, Transformations), Power Exchange, Oracle 10g, Quest Central, SQL Developer, SQL*Plus, PL/SQL (stored procedures, triggers, packages), Teradata, DB2, Erwin, MS Visio, Windows 2000, UNIX HP-UX
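The dynamic lookup used in this role serves one purpose: rows inserted earlier in the same run are visible to later rows, so duplicates within a batch are caught. A minimal Python sketch of that caching behavior (illustrative only; the key column name is an assumption):

```python
def load_with_dynamic_lookup(incoming, target, key):
    """Sketch of a dynamic lookup cache: the cache is primed from the
    target table and updated as rows pass through, so duplicate keys
    within the same run hit the cache and are skipped rather than
    causing double inserts."""
    cache = {r[key] for r in target}   # prime the cache from the target
    for row in incoming:
        if row[key] in cache:
            continue                   # key already present: skip (no insert)
        cache.add(row[key])            # insert + update the cache in one step
        target.append(row)
    return target
```

A static lookup cache would miss the second occurrence of a new key in the same batch; the in-run cache update is exactly what "dynamic" buys here.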
The framework captures data daily from each of the client's various locations. The organization maintains this consistent data store for global-level, region-level, and functional-level data analysis. The data was used to create a baseline for data marts. With this centralized data infrastructure, the client could view its sales information more accurately.
Responsibilities:
- Involved in the Data warehouse Data modeling based on the client requirements.
- Created the data warehouse data model using Erwin.
- Coordinating with the client and gathering the User requirements.
- Created Star and Snowflake Schema for the Data Model Designing.
- Good experience with data conversion functions: extracting from sources, transforming according to logic, and loading into target tables using Informatica.
- Used timestamps on rows, version numbers on rows, triggers on tables, status indicators on rows, and SCD (Type 1, Type 2, Type 3) as CDC solutions; change data capture simplifies ETL in data warehousing applications.
- Identified the Dimensions and Facts for the Data Modeling
- Created the entity relationships for the facts and dimensions in the data model design.
- The data mart used is the Sales data mart, with Oracle as the target database.
- Good experience importing sources from Teradata, Oracle, flat files, and SQL Server databases.
- Used query tools such as TOAD and AQT to store and retrieve data warehouse database objects.
- Created users and folders and assigned privileges to them for security; also administered the repository.
- Designed, developed, and demonstrated major business intelligence tools such as Cognos and MicroStrategy to drill down, slice and dice, pivot, and analyze the ABDC sales data.
- Performance tuning on sources, targets, mappings and SQL queries in the transformations
- Applied various optimization techniques in Aggregator, Lookup, and Joiner transformations.
- Created reusable Mapplets and transformations in Informatica.
- Developed mappings to implement Type 2 slowly changing dimensions.
- Migrated mappings and workflows from the development server to the production server.
- Automated the ETL process.
- Performed root cause analysis to find problems.
- Responsible for loading data into warehouse from different sources using Loader Utility to load millions of records.
Environment: Informatica Power Center 7.1.3, Business Objects 6.1, DB2 UDB 8.1, Oracle 10g/9i, SQL Server 2000, Mainframe DB2, COBOL files, Power Exchange, AS/400, Erwin, MS Visio, Advanced Query Tool, Windows 2000, UNIX AIX 5.1
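The timestamp-based CDC approach listed among the change-capture solutions above amounts to comparing a source row's modification time against the target snapshot. A hedged Python sketch (column names `k` and `updated_at` are hypothetical):

```python
def detect_changes(source, target, key, ts_col="updated_at"):
    """Timestamp-based change data capture: classify source rows as
    inserts (key absent from target) or updates (newer timestamp than
    the target copy); unchanged rows are ignored."""
    existing = {r[key]: r for r in target}
    inserts, updates = [], []
    for row in source:
        old = existing.get(row[key])
        if old is None:
            inserts.append(row)                 # new key: insert
        elif row[ts_col] > old[ts_col]:
            updates.append(row)                 # newer version: update
    return inserts, updates
```

The version-number and status-indicator variants mentioned above differ only in which column drives the comparison; the classification logic is the same.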
This data warehouse was designed for financial reporting at COMPAQ Financial Services. It reports the financial historical data stored in various databases and flat files. Data from different sources is brought into Oracle using Informatica ETL.
Responsibilities:
- Used Informatica (Power Center) ETL to load data from MS SQL Server 6.5 and Sybase into the target Oracle 8i database.
- Worked on Informatica Source Analyzer, Warehouse Designer, Mapping Designer, and Transformations.
- Worked with different transformations like Stored Procedure, Filter, Expression, Aggregator and Joiner.
- Involved in the development of Informatica mappings and also tuned them for better performance.
- Created Informatica mappings with PL/SQL procedures/functions to build business rules to load data.
- Created Catalogs, Filters, Calculations, and Prompts using Impromptu Administrator.
- Created different report files like IMR, IQD and Created Power cubes in Transformer.
- Involved in Logical and Physical modeling using ERWIN.
- Tuned the mappings for better performance and maximum efficiency.
Environment: Informatica Power Center 7, Cognos Impromptu Administrator, Transformer, IWR, Oracle 8i, SQL, PL/SQL, IIS Server 5.0, Microsoft Analysis Server 2000, JSP, XML, MS SQL Server 2000, Erwin 4.0, Microsoft PowerPoint, Visio, Word, Excel, UNIX HP, Windows NT 4.0.