Sr. Informatica ETL Developer Resume
SUMMARY:
- 7+ years of IT experience in database/data warehousing technology, spanning analysis, design, development and implementation of business systems in various environments.
- 6+ years of experience in development and maintenance of SQL, PL/SQL, Stored procedures, functions, constraints, indexes and triggers.
- 6+ years of experience with multiple Oracle database versions (10g/9i/8i).
- 4+ years of experience in UNIX shell scripting.
- Experience in creating batch scripts in DOS and Perl Scripting.
- Experience in ETL development process for Data Warehousing, Data migration and Production support.
- Extensive experience in Design, Development, Implementation, Production Support and Maintenance of Data Warehouse Business Applications in Pharmaceutical, Health Care, Insurance, Financial and Telecommunication industries.
- Involved in requirement gathering, design, development and implementation of data warehouse projects.
- Sound knowledge of Relational and Dimensional modeling techniques of Data warehouse (EDS/Data marts) concepts and principles (Kimball/Inmon) – Star/Snowflake schema, SCD, Surrogate keys and Normalization/De-normalization.
- Data modeling experience in creating Conceptual, Logical and Physical Data Models using ERwin Data Modeler.
- Experience with TOAD to test, modify, analyze data, create indexes, and compare data from different schemas.
- Extensively worked on Development using SQL Developer, SQL*Loader and TOAD tools.
- Extensive experience in designing and creating data marts using ETL tool Informatica.
- Expertise in designing and developing ETL Informatica Mappings, Sessions and Workflows using Informatica 7.1, 8.1, 8.6 and 8.6.1 (Designer, Workflow Manager and Workflow Monitor).
- Extensive experience in developing Extracting, Transforming and Loading (ETL) processes using Informatica PowerCenter and full life cycle implementation of data ware house.
- Worked on Slowly Changing Dimension (SCD) Types 1, 2 and 3 to keep track of historical data.
- Knowledge of Informatica PowerExchange (formerly PowerConnect) for capturing changed data.
- Proficiency in data warehousing techniques for data cleansing, surrogate key assignment and change data capture (CDC).
- Proficient with Informatica Data Quality (IDQ) for data cleansing and standardization in the staging area.
- Experience in integration of various data sources like Oracle, DB2, flat files and XML files into ODS, and good knowledge of Teradata 12.0/13.0, SQL Server 2000/2005/2008 and MS Access 2003/2007.
- Proficient in using Informatica tools like Informatica Designer, Repository Manager, Workflow Manager and Workflow Monitor.
- Expertise in implementing complex business rules by creating re-usable transformations, Mapplets and Mappings.
- Optimized solutions using various performance tuning methods: SQL tuning; ETL tuning (optimal configuration of transformations, targets, sources, mappings and sessions); and database tuning using hints, indexes, partitioning, materialized views, external tables, procedures and functions.
- Extensively performed Performance Tuning of sources, targets, mappings and sessions.
- Experience in UNIX shell scripting and configuring Cron jobs for Informatica job scheduling, backup of repository and folder.
- Extensively used Autosys and Tidal for scheduling the UNIX shell scripts and Informatica workflows.
- Experience in using the Informatica command line utilities like pmcmd, pmrepserver and pmrepagent to execute or control sessions, tasks, workflows and deployment groups in non-windows environments.
- Extensive knowledge of reporting tools such as Cognos and BusinessObjects.
- Extensive knowledge in creating universes, reports, dashboard KPIs and ad hoc reports using BusinessObjects.
- Extensive knowledge in all areas of Project Life Cycle including requirements analysis, system analysis, design, development, documentation, testing, implementation and maintenance.
- Strong analytical, verbal, written and interpersonal skills.
- Proactive in learning new technologies and updating my skills.
TECHNICAL SKILLS:
- Data Modeling Tools: ERwin, MS Visio.
- Data Management and Integration: VLTrader V4.2
- ETL Tools: Informatica PowerCenter 8.x/ 7.x/6.x, PowerExchange, IDE/IDQ.
- OLAP/Reporting Tools: Cognos 8.1, Business Objects XI/6.5.
- Databases: Oracle 10g/9i/8i, SQL Server 2000/2005/2008, Teradata 12.0/13.0, DB2 (UDB) 8.1.
- Database Utilities: WinSQL, Toad, SQL*Loader, SQL Developer, SQL*PLUS, WinSCP.
- Operating Systems: UNIX (Solaris 8/10), LINUX, Windows NT, 2000, XP, 2003, Win server 2008.
- Languages: T-SQL, SQL, PL/SQL, C, C++.
- Scheduling Tools: Tidal, Autosys, Crontab and Control-M.
- Methodologies: Data Modeling–Logical/Physical/Dimensional, Star/Snowflake schemas, Fact and Dimension Tables, Software Development Life Cycle.
EXPERIENCE
Sr. Informatica ETL Developer Jan 2012 – Present
Confidential, Parsippany, NJ
Responsibilities:
- Analyzed the business requirement and designed the ETL process to generate and integrate the processed claims data for the Contract Rebate System Integration Project of DSI with a vendor.
- Source was CARS database on Oracle and targets were pipe-delimited flat files.
- Edited the views and created new views according to the business requirement.
- Designed the staging table structures and created the tables in the Oracle database to reduce ETL mapping complexity.
- Used SQL query overrides in Source Qualifier and Lookup transformations to reduce mapping complexity.
- Worked with Expression, Lookup and Filter transformations and flat file targets.
- Parameterized the whole process by using the parameter file for the variables.
- Used Command tasks to regenerate parameter files.
- Created the deployment groups in development environment.
- Tidal Scheduler was implemented for scheduling of Informatica workflows.
- Interacted with the vendor and set up the SFTP connection to the vendor's FTP site for transferring extracted files.
- MFT (managed file transfer) was used for the transmission of the extracted files to the vendor.
- Set up SFTP connections to various vendors for receiving and sending of encrypted data as part of MFT Integration team support.
- Involved in Unit testing and system integration testing (SIT) of Informatica and MFT projects.
- Created the mapping specification, workflow specification and operations guide for the Informatica projects and MFT run book as part of end user training.
- Involved in knowledge transfer sessions with the off-shore team for the above-mentioned projects.
Environment: Informatica PowerCenter 8.6.1/8.1.1, Windows Server 2008, MS SQL Server 2005, batch scripting, Oracle Database 10g, SQL Developer 3.0.04, flat files, MFT server (Managed File Transfer), VLTrader V4.2, Tidal 5.3.1.
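The parameter-file regeneration performed by the Command task above can be sketched as a small shell script. The repository folder, workflow and variable names below are illustrative, not from the actual project:

```shell
#!/bin/sh
# Regenerate an Informatica parameter file before the next workflow run.
# Folder, workflow and variable names are hypothetical examples.
PARAM_FILE=/tmp/wf_claims_extract.par   # illustrative path
RUN_DATE=$(date +%Y%m%d)

cat > "$PARAM_FILE" <<EOF
[DSI_Rebates.WF:wf_claims_extract]
\$\$RUN_DATE=$RUN_DATE
\$\$TGT_FILE=claims_${RUN_DATE}.txt
EOF

echo "Wrote $PARAM_FILE"
```

A script like this would typically run as a post-session or pre-workflow Command task so that each run picks up a fresh run date and target file name.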
Sr. Informatica Software developer Sep 2011-Dec 2011
Confidential, St. Louis, MO
Responsibilities:
- Developed the mapping specifications from the business requirements and the data model.
- Created Mappings to load data using various transformations like Source Qualifier, Sorter, Lookup, Expression, Router, Joiner, Filter, Update Strategy and Aggregator transformations.
- Worked specifically with the Normalizer transformation, defining incoming fixed-width files with COBOL copybooks and using the Normalizer to normalize the data.
- Worked with Lookup Dynamic caches and Sequence Generator cache.
- Created reusable transformations and Mapplets for use in multiple mappings, and worked with shortcuts for various Informatica repository objects.
- Enhanced existing mappings by fixing defects and adding new transformations and logic.
- Worked on all aspects of Teradata loading and query tuning activities.
- Used Teradata utilities like FastLoad, MultiLoad and Teradata SQL Assistant.
- Created DDL’s to create tables and views on Teradata.
- Split the source file using shell script and parse it to make independent files and load onto Teradata.
- Load data files coming from external vendors onto Teradata EDW using mload and fload utilities.
- Also used Teradata Parallel Transporter (TPT) for loading pipe-delimited, comma-delimited and fixed-width data files onto the Teradata EDW.
- Created jobs and job variable files for Teradata TPT and ran loads using the tbuild command from the command line.
- Worked with Informatica PowerExchange to pull changed data in the form of condense files and load it into Teradata tables using TPump import.
- Created data breakpoints and error breakpoints in the Informatica Designer Debugger to debug and fix mappings.
- Converted workflows from relational connections to MultiLoad connections for Teradata, which drastically reduced overall workflow run time and improved performance.
- Created parameter files with Global, mapping, session and workflow variables.
- Extensively worked with Korn shell scripts for parsing and moving files, and for regenerating parameter files in post-session Command tasks.
- Used sed, a UNIX stream-editing utility, in various shell scripts.
- Worked with Post-session on-success variable assignment in Informatica Workflow manager to assign variable values from the mapping to the workflow variables.
- Created re-usable, non-reusable sessions and Email tasks for on success/on failure mails.
- Analyzed the session logs, loader logs for Teradata mload and fload for errors and troubleshoot them.
- Scheduling of Informatica workflows using Tidal Scheduler.
- Migrated code (workflows) from DEV to TEST repositories in Informatica by creating deployment groups and folders, applying labels and creating queries in the Informatica Repository Manager.
- Maintenance and support for Informatica projects in Production.
- Joined morning tag-up calls with the BO support team to review the latest development on projects in the twice-monthly Production releases.
- Provided on-call support and assistance through VPN.
Environment: Informatica PowerCenter 8.6.0/8.6.1 (Designer, Workflow Manager, Workflow Monitor, Repository Manager), Informatica PowerExchange, UNIX, Teradata 13.0, shell scripting including sed, pipe-delimited, CSV and fixed-width flat files, Tidal 5.3.1.
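The file splitting and sed-based parsing described above can be sketched roughly as follows. File paths, chunk size and the sample data are illustrative, not from the actual project:

```shell
#!/bin/sh
# Split a large vendor file into fixed-size chunks and normalize each
# chunk with sed before loading to Teradata. Paths are hypothetical.
SRC=/tmp/vendor_feed.txt
OUT_DIR=/tmp/chunks
mkdir -p "$OUT_DIR"

# Sample pipe-delimited data with trailing spaces, for illustration only.
printf 'a|1 \nb|2 \nc|3 \n' > "$SRC"

# Split into 2-line pieces: part_aa, part_ab, ...
split -l 2 "$SRC" "$OUT_DIR/part_"

# Strip trailing whitespace from every chunk in place via sed.
for f in "$OUT_DIR"/part_*; do
  sed 's/[[:space:]]*$//' "$f" > "$f.clean" && mv "$f.clean" "$f"
done
ls "$OUT_DIR"
```

Each cleaned chunk could then be handed to MultiLoad/FastLoad as an independent input file.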
Sr. Informatica ETL developer Feb 2011- Aug 2011
Confidential, Parsippany, NJ
Responsibilities:
- Analyzed the business requirement to generate the XML output for the Campaign Management project.
- Worked with XML targets for the data coming from SQL server source.
- Performed query tuning and used SQL query overrides in the Source Qualifier transformation to pull historical data no earlier than a given date.
- Worked with Expression, Lookup, Sequence Generator, Normalizer and Router transformations, and with XML and flat file targets.
- Extensively used Perl scripts to edit the XML files and calculate line counts according to the client's needs.
- Worked with re-usable sessions, decision task, control task and Email tasks.
- Parameterized the whole process by using the parameter file for the variables.
- Used command task for the recreation of parameter files.
- Imported an XSD file to create the XML target and to define the hierarchical relationship and normalized views.
- Edited the views and created new views according to the business requirement.
- Designed and built a web services integration to download Forex currency data to flat files.
- Implemented the logic by using HTTP transformation to query the web server.
- Configured and set up a secure FTP connection to the vendor using the Informatica Managed File Transfer software.
- Created complex batch scripts for various sets of actions in MFT (managed file transfer) to automate execution, such as validating the presence of indicator files and running actions concurrently for successful transmission.
- Encrypted and compressed the XML and flat files in MFT using the ZIP PGP compression algorithm and PGP encryption algorithms such as TripleDES, Blowfish, AES-256 and CAST5.
- Pushed the compressed, encrypted XML and flat files to the external vendor using MFT.
- Resolved mapping errors by analyzing session and workflow logs.
- Involved in Unit testing and system integration testing (SIT) of the projects.
- Created the deployment groups in development environment.
- Created jobs for automation of the Informatica Workflows and for DOS copy and moving of files using Tidal Scheduler.
- Created the mapping specification, workflow specification and operations guide for the Informatica projects as part of end user training.
- Provided maintenance and support for the above projects in the Production environment.
Environment: Informatica PowerCenter 8.6.1/8.1.1, Windows Server 2008, MS SQL Server 2005, batch scripting, Perl scripting, XML targets, flat files, MFT server (Managed File Transfer), VLTrader V4.2, Tidal 5.3.1.
Sr. Informatica Developer Jan 2010 – Jan 2011
Confidential, Pleasanton, CA
Responsibilities:
- Analyze business requirements to build a data mart design for various business processes conformed to the business rules.
- Developed the mapping specifications from the business requirements and the data model.
- Wrote SQL queries and created PL/SQL stored procedures.
- Involved in Developing OLAP models like facts, measures and dimensions.
- Implemented change data capture (CDC) using Informatica PowerExchange to update tables in the Oracle database.
- Used FTP services to retrieve Flat Files from the external sources and RDBMS sources Teradata and Oracle.
- Created Mappings to load data using various transformations like Source Qualifier, Sorter, Lookup, Expression, Router, Joiner, Filter, Update Strategy and Aggregator transformations.
- Created Data Breakpoints and Error Breakpoints for debugging the mappings.
- Created Reusable Transformations and Mapplets to use in Multiple Mappings and also worked with shortcuts.
- Implemented Type2 slowly changing dimensions (SCD).
- Designed complex mappings involving target load plan for pipeline loading and constraint based loading.
- Worked on all aspects of Teradata loading and query tuning activities.
- Used Teradata utilities like FastLoad, MultiLoad, FastExport and BTEQ.
- Extracted data from Teradata source systems to a flat file.
- Created parameter files and scheduled workflows.
- Worked with both serial and parallel loading and created re-usable, non-reusable sessions and Email tasks for on success/on failure mails.
- Worked with Workflow Manager for Creating, Validating, Testing and running the sequential and concurrent Batches and Sessions.
- Identified the errors by analyzing the session logs and solved them.
- Optimized mappings by applying performance tuning techniques and Informatica best practices.
- Maintenance and support for Informatica interfaces/projects in Production.
- Used pmcmd command to run workflows through shell scripts and command line.
- Created various UNIX Shell Scripts for pre/post session commands for automation of loads using Autosys.
- Involved in Unit testing and system integration testing (SIT).
- Created mapping documentation and user guides for ETL processes as part of end user training.
- Created and maintained Custom, Cross Tab and Drill Through reports for the client using Business Objects and generated reports using BO tools by querying the database.
Environment: Informatica PowerCenter 8.6, PowerExchange, Oracle 10g, Teradata 12.0, SQL, PL/SQL, ERwin, Toad 9.6, WinSCP, UNIX (Solaris 10), Autosys, Business Objects XI/6.5.
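Running workflows through pmcmd from shell scripts, as described above, typically looks like the sketch below. The Integration Service, folder and workflow names are hypothetical, and a dry-run guard is included so the script can be exercised where pmcmd is not installed:

```shell
#!/bin/sh
# Wrapper to start an Informatica workflow with pmcmd, as typically
# invoked from an Autosys job. Service, folder and workflow names
# are illustrative examples, not from the actual project.
INFA_SERVICE=IS_DEV            # hypothetical Integration Service name
INFA_FOLDER=SALES_DM           # hypothetical repository folder
WORKFLOW=wf_load_sales_fact    # hypothetical workflow
DRY_RUN=${DRY_RUN:-1}          # default to dry-run for demonstration

CMD="pmcmd startworkflow -sv $INFA_SERVICE -f $INFA_FOLDER -wait $WORKFLOW"
if [ "$DRY_RUN" -eq 1 ]; then
  echo "$CMD"                  # show the command instead of running it
else
  $CMD || exit 1               # propagate failure to the scheduler
fi
```

The `-wait` flag makes pmcmd block until the workflow completes, so the script's exit code reflects the workflow result and the scheduler can react to failures.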
Informatica Developer Nov 2008 – Dec 2009
Confidential, NYC, NY
Responsibilities:
- Created technical design specification documents for Extraction, Transformation and Loading based on the business requirements.
- Developed logical and physical data models, with experience in forward and reverse engineering using ERwin.
- Designed and developed source-to-target data mappings, effectively using Informatica best practices and techniques for complex mapping designs.
- Experience in using FTP services to retrieve Flat Files from external sources.
- Worked with wide range of sources such as delimited flat files, XML sources, MS SQL server, DB2 and Oracle databases.
- Created complex mappings to implement the business logic using Connected/Unconnected Lookup, Sorter, Aggregator, Update Strategy, Router and dynamic Lookup transformations to populate target tables efficiently.
- Used PowerCenter Workflow Manager to create workflows and sessions, and used various tasks like Command, Event Wait, Event Raise and Email to run with the logic embedded in the mappings.
- Involved in creating and modifying UNIX shell scripts and scheduling through Crontab.
- Developed PL/SQL stored procedures for source pre load and target pre load.
- Developed sessions using Informatica Server Manager and was solely responsible for the daily loads and handling of the reject data.
- Created Shell Scripts to execute the sessions using pmcmd command line interface.
- Analyzed the session logs and resolved the errors for successful completion of workflows.
- Troubleshoot mappings and performance tuning of mappings by identifying source, target and mapping bottlenecks.
- Used techniques like source query tuning, single pass reading and caching lookups to achieve effective performance in less time.
- Supported the Quality Assurance team in testing and validating the Informatica workflows.
- Involved in unit testing and system integration testing, and conducted peer reviews.
- Provided support in creating List Reports, Cross Tab Reports and ad hoc reports using Cognos.
Environment: Informatica PowerCenter 8.6, Oracle 10g/9i, DB2 (UDB) 8.1, MS SQL Server 2008, PL/SQL, TOAD 7.6, XML, ERwin, WinSQL, WinSCP, Crontab, HP-UX 10.20, Windows NT 4.0/2000, Cognos 8.1.
Informatica Developer Oct 2007 – Nov 2008
Confidential, Cranbury, NJ
Responsibilities:
- Analyzed the business requirements and enhanced the existing data warehouse architectural design for better performance.
- Involved in preparing High-Level Design (HLD) and Low-Level Design (LLD) documents.
- Used SQL*Loader to bulk-load data.
- Extensively used PL/SQL scripts for transformations.
- Developed new Mapplets and data transformation functions using Informatica PowerCenter to transform and populate data into data warehouse target tables from Oracle source tables as well as flat files and MS Access.
- Designed various mappings using Lookup (connected and unconnected), Router, Update Strategy, Filter, Sequence Generator, Joiner, Aggregator and Expression transformations.
- Developed reusable transformations and mapplets.
- Designed complex mappings involving target load order and constraint based loading.
- Configured and ran the Debugger from within the Mapping Designer to troubleshoot the mapping before the normal run of the workflow.
- Created and fine-tuned sessions by collecting performance details on the Informatica server, evaluated them, and identified the causes of slow performance in the source and target databases.
- Extensively used various performance tuning techniques like pipeline partitioning to improve the session performance.
- Used pmcmd command to run workflows from command line interface.
- Set up batches and sessions to schedule the loads at the required frequency.
- Created UNIX Shell Scripts for pre/post session commands for automation of loads using Autosys.
- Actively involved in end user training and support.
Environment: Informatica PowerCenter 8.6, Oracle 10g, MS Access 2007, Toad, PL/SQL, ERwin, WinSQL, WinSCP, UNIX, Windows XP, Autosys.
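The SQL*Loader bulk loads mentioned above are driven by a control file along these lines (the table, file and column names here are illustrative, not from the actual project):

```sql
-- load_customers.ctl: illustrative SQL*Loader control file
LOAD DATA
INFILE 'customers.dat'
APPEND
INTO TABLE stg_customers
FIELDS TERMINATED BY '|' OPTIONALLY ENCLOSED BY '"'
(cust_id,
 cust_name,
 created_dt DATE 'YYYY-MM-DD')
```

It would be invoked from the command line as, for example, `sqlldr userid=scott control=load_customers.ctl log=load_customers.log`, with bad and discard files reviewed after each run.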
Informatica Developer Nov 2006 – Jul 2007
Confidential, Houston, TX
Responsibilities:
- Performed Extraction, Transformation and Loading using Informatica to build the data warehouse.
- Accomplished automated data extraction from various RDBMS via scripts, ETL processing using Informatica and loading into Oracle Data warehouse.
- Source data was extracted from Oracle, SQL Server, flat files, COBOL sources and XML sources.
- Extensively worked with Informatica Designer, Workflow Manager and Workflow Monitor.
- Worked with Source Qualifier, Sorter, Aggregator, Expression, Joiner, Filter, Sequence generator, Router, Update Strategy, Lookup and Normalizer transformations.
- Created complex Joiner and other transformations as needed to pass data smoothly through ETL mappings.
- Implemented Type1 and Type2 slowly changing dimensions (SCD) logic.
- Configured and ran the Debugger from within the Mapping Designer to troubleshoot the mapping before the normal run of the workflow.
- Designed and developed PL/SQL stored procedures using dynamic SQL to check for the existence of partitions and create them with appropriate values when absent.
- Used TOAD to evaluate SQL execution.
- Performance Tuning of the target, source and mapping definitions was undertaken.
- Used Workflow Manager for creating, validating, testing and running the sequential and concurrent batches and sessions.
- Extensively used SQL and PL/SQL Scripts and worked in both UNIX and Windows Environment.
- Performed Unit testing and Integration testing.
- Involved in creating and modifying UNIX Korn shell scripts and scheduling the UNIX scripts through Control-M.
Environment: Informatica PowerCenter 8.1, Oracle 10g, SQL Server 2005, COBOL, flat files, Toad, SQL Developer, WinSCP, SQL, PL/SQL, Control-M, UNIX, Windows XP, ERwin.
EDUCATION:
Bachelor of Technology in Electronics and Communication Engineering