- 7.5 years of IT experience with hands-on ETL development and Mainframe work using Informatica PowerCenter 10.1.1/9.6.1/9.5.1/8.6.1, PowerExchange 9.5.1/8.6, Data Warehousing, PL/SQL, Oracle and Teradata.
- Experienced in optimizing database querying, data manipulation and population using SQL, PL/SQL and utilities in Oracle 12c/11g/10g/9i, Teradata 14/13/12, DB2 UDB and SQL Server 2012/2008/2000 databases.
- Hands-on experience with Informatica administration and various upgrades.
- Experienced in installation, configuration and administration of the Informatica PowerCenter client.
- Hands-on experience migrating DataStage 8.7 ETL processes to Talend Studio.
- Experienced in ETL methodology for data migration, data profiling, extraction, transformation and loading using Talend; designed data conversions from a wide variety of source systems including Oracle, DB2, Netezza, SQL Server, Teradata, Hive, Hana and non-relational sources like flat files, XML and Mainframe files.
- Strong understanding of OLAP and OLTP Concepts.
- Excellent at designing ETL procedures and strategies to extract data from heterogeneous source systems like Oracle 11g/10g, SQL Server 2008/2005, DB2 10, flat files, XML, SAP R/3, etc.
- Experience in SQL, PL/SQL and UNIX shell scripting.
- Hands-on experience working in Linux, UNIX and Windows environments.
- Experience writing and modifying ETL design documentation, test results documentation and standard operating procedures (SOP) documentation.
- Proven experience writing and analyzing technical requirements.
- Good knowledge of data quality measurement using IDE.
- Extensive ETL experience using Informatica PowerCenter (Designer, Workflow Manager, Workflow Monitor and Server Manager), Teradata and Business Objects.
- Worked with Teradata loading utilities like MultiLoad, FastLoad, TPump and BTEQ.
- Experience in developing Shell scripts on UNIX and experience with UNIX command line.
- Knowledge of Software Testing and Quality Assurance.
- Designed a real-time data warehouse using the CDC tool Informatica PowerExchange 9.5.1/8.6.1.
- Excellent communication and writing skills and capable of working successfully both in teams and independently.
- Ability to self-manage time and task priorities to meet project timelines, and identify potential project risks.
- Expertise in handling errors, exceptions and performance issues in Informatica.
- Experience in Unix Shell Scripting for data manipulation and scheduling tasks.
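The shell-scripting bullets above (data manipulation and scheduled tasks) can be sketched with a minimal example; the pipe-delimited layout, file names and cron path are illustrative assumptions, not from any specific project.

```shell
#!/bin/sh
# Sketch of flat-file manipulation in a UNIX shell, as described above.
# The layout (id|name|amount) and file names are illustrative.

# A small sample extract containing one exact duplicate record.
cat > customers.dat <<'EOF'
101|ACME|250.00
102|GLOBEX|125.50
101|ACME|250.00
EOF

# Drop exact duplicates and keep only rows whose amount field is numeric.
awk -F'|' '!seen[$0]++ && $3 ~ /^[0-9]+(\.[0-9]+)?$/' customers.dat > customers_clean.dat

wc -l < customers_clean.dat
```

A job like this would typically be scheduled from cron, e.g. `0 2 * * * /opt/etl/clean_extract.sh` (illustrative path).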
Operating Systems: UNIX, Linux, Windows XP/2000/98/95/7
Programming Languages: C, C++, HTML, PL/SQL, ASP.NET, Java, UNIX Shell Scripting, VB
Virtualization Tools: VMware, vSphere, ESX/ESXi, vCenter Server
Databases: Oracle 11g/10g/9i, Teradata V2R4, DB2, Netezza, TOAD, MS SQL Server 2008/2012, SAP BW, SAP Logon
ETL Tools: Informatica PowerCenter 10.1.1/9.6.1/9.1/8.6, Informatica Data Quality (IDQ), Informatica MDM Data Director 10.0/9.7/9.5
BI Tools: Business Objects 6.x, Cognos 8.0
Database Tools: Oracle 10g/11g/12c RAC, Cassandra DB, MongoDB, DB2, MS Access, PL/SQL, MySQL, PostgreSQL
FTP Tools: Ipswitch, WinSCP
Web Technologies: Servlets, JDBC, JSP, HTML, CSS, XML
- Worked on the Data Integration team performing data and application integration, with the goal of moving more data effectively, efficiently and with high performance to support business-critical projects involving large-scale data extraction.
- Performed technical analysis, ETL design, development, testing and deployment of IT solutions as needed by the business or IT.
- Participated in designing the overall logical and physical data warehouse/data mart data models and data architectures to support business requirements.
- Performed data manipulations using various Talend components like tMap, tJavaRow, tJava, tOracleRow, tOracleInput, tOracleOutput, tMSSQLInput and many more.
- Analyzed source data to assess data quality using Talend Data Quality.
- Troubleshot data integration issues and bugs, analyzed reasons for failure, implemented optimal solutions, and revised procedures and documentation as needed.
- Worked on migration projects moving data warehouses from Oracle/DB2 to Netezza.
- Used SQL queries and other data analysis methods, as well as the Talend Enterprise Data Quality Platform, to profile and compare data, informing decisions on how to measure business rules and data quality.
- Wrote Netezza SQL queries for table joins and modifications.
- Used Talend reusable components like routines, context variables and global map variables.
- Responsible for tuning ETL mappings, workflows and the underlying data model to optimize load and query performance.
- Monitored and supported the Talend jobs scheduled through Talend Admin Center (TAC).
- Developed Oracle PL/SQL stored procedures and worked on performance tuning of SQL.
Environment: Talend 6.1/5.6, Netezza, Oracle 12c, IBM DB2, Aginity, Business Objects 4.1, SQL Server 2012, XML, Hive, Pig, SQL, PL/SQL, JIRA.
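The Netezza SQL work described above can be sketched as a query file of the kind that would be fed to nzsql; the table and column names (stg_orders, dim_customer) and connection details are illustrative assumptions.

```shell
#!/bin/sh
# Sketch of a Netezza-style join query, written to a file as it would be for
# nzsql. Table and column names are illustrative, not from a specific project.
cat > order_report.sql <<'EOF'
SELECT o.order_id,
       c.cust_name,
       SUM(o.order_amt) AS total_amt
FROM   stg_orders o
JOIN   dim_customer c
  ON   c.cust_id = o.cust_id
GROUP  BY o.order_id, c.cust_name;
EOF

# On a real system this would run as (illustrative connection details):
#   nzsql -host nzhost -d edw -u etl_user -f order_report.sql
cat order_report.sql
```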
Confidential - Boston, MA
- Involved in gathering business requirements, interacting with business users, and translating requirements into ETL high-level and low-level designs.
- Documented both high-level and low-level design documents; involved in ETL design and data model development.
- Worked in importing and cleansing of data from various sources like Teradata, Oracle, flat files, SQL Server 2008 with high volume data.
- Developed T-SQL Stored Procedures and Triggers and User defined Data Types and Functions to enforce the business rules.
- Replaced Lookup transformations with stored procedures and bridge tables to resolve memory space issues.
- Worked on improving data warehouse solutions by translating business requirements into robust, scalable, and supportable solutions that work well within the overall system architecture.
- Used calculations, variables, breakpoints, drill down, slice and dice, and alerts to create Business Objects reports.
- Developed complex ETL mappings and worked on the transformations like Source qualifier, Joiner, Expression, Sorter, Aggregator, Sequence generator, Normalizer, Connected Lookup, Unconnected Lookup, Update Strategy and Stored Procedure transformation.
- Extracted, loaded and transformed data from Oracle to Greenplum using Greenplum utilities like gpload, external tables and gpfdist, along with DataStage.
- Created queries, query groups and packages in the MDM Hub Console.
- Wrote T-SQL statements for data retrieval.
- Implemented Slowly Changing Dimension Type 1 and Type 2 for inserting and updating Target tables for maintaining the history.
- Worked on loading data from different sources like Oracle, DB2, EBCDIC files (created copybook layouts for the source files) and ASCII delimited flat files to Oracle targets and flat files.
- Created Netezza views used in MicroStrategy as a semantic layer on the data marts.
- Used Netezza Bulk writer and Netezza Bulk reader in Informatica to load and read data from Netezza.
- Responsible for data mapping testing, writing complex T-SQL queries using Query Studio.
- Created Informatica mappings with PL/SQL Procedures/Functions and triggers in T-SQL to build business rules to load data.
- Involved in migration of data from Oracle to Netezza.
- Worked with mapping variables, mapping parameters and workflow variables; implemented SQL scripts and shell scripts in pre-session and post-session commands.
- Wrote SQL*Loader scripts to prepare test data in the development and test environments and while fixing production bugs.
- Converted Teradata-syntax tables, views, functions and stored procedures to Greenplum/Postgres syntax using SQL and PL/pgSQL.
- Used the Debugger to identify processing bottlenecks and performance-tuned Informatica workflows.
- Created ETL deployment groups and packages for promotion to higher environments.
Environment: Informatica Power Center 9.5.1/9.1, Oracle 11g/10g/9i, DB2, MS Access, TOAD 9.0, Netezza TwinFin 6.0, UNIX, Teradata
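The SQL*Loader work described in this section can be sketched with a minimal control file; the staging table and column names (stg_customer, cust_id, ...) and credentials are illustrative assumptions.

```shell
#!/bin/sh
# Minimal sketch of a SQL*Loader control file for staging test data, as the
# bullets above describe. Table and column names are illustrative.
cat > load_stage.ctl <<'EOF'
LOAD DATA
INFILE 'customers.dat'
APPEND
INTO TABLE stg_customer
FIELDS TERMINATED BY '|' TRAILING NULLCOLS
(
  cust_id,
  cust_name,
  load_dt  "SYSDATE"
)
EOF

# With an Oracle client installed, the load would run as (illustrative login):
#   sqlldr userid=etl_user/secret control=load_stage.ctl log=load_stage.log
cat load_stage.ctl
```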
Informatica/ETL Developer
- Performed extraction, transformation and loading into the database using Informatica. Involved in logical and physical modeling of the drugs database.
- Designed the ETL processes using Informatica to load data from Oracle, Flat Files to target Oracle Data Warehouse database.
- Based on the requirements created Functional design documents and Technical design specification documents for ETL.
- Created tables, views, indexes, sequences and constraints.
- Developed stored procedures, functions and database triggers using PL/SQL according to specific business logic.
- Transferred data to the database using SQL*Loader.
- Involved in testing of Stored Procedures and Functions. Designed and developed table structures, stored procedures, and functions to implement business rules.
- Implemented SCD methodology including Type 1 and Type 2 changes.
- Used legacy systems, Oracle, and SQL Server sources to extract the data and to load the data.
- Involved in design and development of data validation, load process and error control routines.
- Used pmcmd to run workflows and created Cron jobs to automate scheduling of sessions.
- Involved in ETL process from development to testing and production environments.
- Analyzed the database for performance issues and conducted detailed tuning activities for improvement.
- Generated monthly and quarterly drugs inventory/purchase reports.
- Coordinated database requirements with Oracle programmers and wrote reports for sales data.
Environment: Informatica Power Center 7.1, Oracle 9, SQL Server 2005, XML, SQL, PL/SQL, UNIX Shell Script.
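The pmcmd-and-cron scheduling described above can be sketched as a wrapper script; the service, domain, folder and workflow names are illustrative assumptions, and the script falls back to printing the command so the sketch runs on a machine without an Informatica client.

```shell
#!/bin/sh
# Sketch of a cron-driven pmcmd wrapper, as described above. The service,
# domain, folder and workflow names are illustrative assumptions.
INFA_SVC=IS_DEV
INFA_DOMAIN=Domain_Dev
FOLDER=SALES
WORKFLOW=wf_daily_load

CMD="pmcmd startworkflow -sv $INFA_SVC -d $INFA_DOMAIN -u etl_user -pv INFA_PASS -f $FOLDER -wait $WORKFLOW"

# Run pmcmd when the Informatica client is installed; otherwise just record
# the command, so the sketch is runnable anywhere.
if command -v pmcmd >/dev/null 2>&1; then
    $CMD
else
    echo "$CMD" | tee run_wf.cmd
fi

# Illustrative crontab entry (daily at 02:00):
#   0 2 * * * /opt/etl/run_wf.sh >> /var/log/etl/wf_daily.log 2>&1
```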
- Designed and implemented ETL for data loads from source to target databases, including fact tables and Slowly Changing Dimensions (SCD) Type 1, Type 2 and Type 3 to capture changes.
- Worked on PowerCenter client tools like Source Analyzer and Warehouse Designer; analyzed the system and met with end users and business units to define requirements.
- Extracted data from Oracle and SQL Server and loaded it into the data warehouse.
- Wrote SQL Queries, Triggers, PL/SQL Procedures, Packages and Shell Scripts to apply and maintain the business rules.
- Translated business requirements into Informatica mappings/workflows.
- Used Source Analyzer and Warehouse designer to import the source and target database schemas, and the Mapping Designer to map the sources to the target.
- Used Informatica Designer to create complex mappings using different transformations like filter, Router, lookups, stored procedure, joiner, update strategy, expression and aggregator transformations to pipeline data to Data Warehouse/Data Marts.
- Developed mappings that extract data from the ODS to the data mart, and monitored the daily and weekly loads.
- Created and monitored Sessions/Batches using Informatica Server Manager/Workflow Monitor to load data into target Oracle database.
- Provided support to develop the entire warehouse architecture and plan the ETL process.
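The SCD Type 2 handling described in this section can be sketched as the expire-and-insert SQL that such a load effectively issues; the tables (dim_customer, src_customer) and the tracked column (cust_addr) are illustrative assumptions, not from a specific project.

```shell
#!/bin/sh
# Sketch of the SCD Type 2 pattern referenced above, written out as plain SQL.
# dim_customer / src_customer and the tracked column cust_addr are illustrative.
cat > scd2.sql <<'EOF'
-- Step 1: expire the current dimension row when a tracked attribute changed.
UPDATE dim_customer d
SET    eff_end_dt   = CURRENT_DATE,
       current_flag = 'N'
WHERE  d.current_flag = 'Y'
  AND  EXISTS (SELECT 1 FROM src_customer s
               WHERE  s.cust_id   = d.cust_id
                 AND  s.cust_addr <> d.cust_addr);

-- Step 2: insert the new version as the current row.
INSERT INTO dim_customer
       (cust_id, cust_addr, eff_start_dt, eff_end_dt, current_flag)
SELECT s.cust_id, s.cust_addr, CURRENT_DATE, DATE '9999-12-31', 'Y'
FROM   src_customer s
WHERE  NOT EXISTS (SELECT 1 FROM dim_customer d
                   WHERE  d.cust_id = s.cust_id
                     AND  d.current_flag = 'Y');
EOF
cat scd2.sql
```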