ETL/Informatica Developer Resume
AL
SUMMARY
- Over 8 years of experience in Information Technology including Data Warehouse/Data Mart development using ETL/Informatica Power Center across various industries such as Healthcare, Insurance, Pharmaceutical, Energy and Banking.
- Extensive experience in using Informatica Power Center 9.x/8.x/7.x to carry out the Extraction, Transformation and Loading process, as well as in administering domains, repositories and folders.
- Extensive experience in using Informatica tool for implementation of ETL methodology in Data Extraction, Transformation and Loading.
- Experience in all the phases of Data warehouse life cycle involving Requirement Analysis, Design, Coding, Testing, and Deployment.
- Extensively worked on Informatica IDQ for data profiling, data enrichment and standardization.
- Extensive hands on experience using Teradata utilities (SQL, BTEQ, Fast Load, Multi Load, Fast Export, TPT), UNIX in developing tools and system applications.
- Experienced in scheduling sequential and parallel jobs using UNIX scripts and scheduling tools (Control-M v7/v8, UC4).
- Expertise in Extraction, Transformation & Loading of data using heterogeneous sources and targets.
- Expertise in creating very detailed design documents and performing Proof of Concepts (POC).
- Good understanding of the underlying concepts and hands-on experience with Repository Manager, Designer and the Informatica Admin Console.
- Involved in all phases of data warehouse project life cycle. Designed and developed ETL Architecture to load data from various sources like DB2 UDB, Oracle, Flat files, XML files, SFDC, Teradata, Sybase and MS SQL Server into Oracle, Teradata, XML, and SQL server targets.
- Hands on experience with NoSQL Databases like MongoDB, Cassandra
- Hands on experience with mappings from varied transformation logics like Unconnected and Connected Lookups, Router, Aggregator, Joiner, Update Strategy, Java Transformations and Re-usable transformations.
- Extracted data from multiple operational sources and loaded the staging area, Data Warehouse and data marts using CDC and SCD (Type 1/Type 2/Type 3) loads.
- Extensively created mapplets, common functions, reusable transformations, look-ups for better usability.
- Created Jobs & Job plans for the Workflows, SQL jobs & Unix scripts by using UC4 tool.
- Good understanding of relational database management systems like Oracle, Teradata, DB2 and SQL Server; worked on data integration using Informatica for the extraction, transformation and loading of data from various database source systems.
- Strong knowledge of Entity-Relationship concept, Facts and dimensions tables, slowly changing dimensions and Dimensional Modeling (Star Schema and Snow Flake Schema).
- Expert in Oracle 11g/10g, IBM DB2 8.1, Sybase, VSAM, SQL Server 2008/2005, SQL, PL/SQL Stored procedures, functions, and exception handling using Toad and PLSQL.
- Worked on performance tuning, identifying and resolving performance bottlenecks at various levels such as sources, targets, mappings and sessions.
- Experience in creating Reusable Tasks (Sessions, Command, Email) and Non-Reusable Tasks (Decision, Event Wait, Event Raise, Timer, Assignment, Worklet, Control).
- Experience in Cognos Report Studio to design Reports based on the requirements by the end user.
- Experience working in agile methodology and ability to manage change effectively.
- Responsible for Team Delivery and Participated in Design Reviews.
- Expertise in defining and documenting ETL Process Flow, Job Execution Sequence, Job Scheduling and Alerting Mechanisms using command line utilities.
- Excellent communication and interpersonal skills, strong analytical reasoning, and the ability to quickly learn and apply new technologies, tools, concepts and ideas.
TECHNICAL SKILLS
ETL Tools: Informatica 9.5/9.1/8.6.1/8.1/7.1.2 (Power Center), IDQ, MDM, Informatica CC360, SQL Server SSIS, DataStage, Informatica CDI.
Data Modeling Tools: Erwin, MS Visio.
Databases: Teradata 15.1, Oracle 12c/11g/10g/9i/8i, MS SQL Server 2008/2005, DB2 UDB, MS Access 2000, Cassandra, MongoDB, Sybase
Others: Toad, SQL Navigator, Teradata SQL Assistant, SFDC, Cognos Report Studio, Python, PowerExchange.
Environment: MS Windows 2008/2005, UNIX
Job Scheduling: Autosys, Shell Scripting, Control-M, UC4
PROFESSIONAL EXPERIENCE
Confidential, AL
ETL/Informatica Developer
Responsibilities:
- Worked with Data analysts in gathering requirements, discussing the issues to be resolved and translating user inputs into ETL design documents.
- Involved in profiling and analyzing data sets and providing optimal solutions for the business to meet their requirements.
- Designed and developed complex ETL processes using Informatica ETL and CDI to load data from source tables to the target tables.
- Developed new ETL code and modified existing code for enhancements driven by new business requirements.
- Designed and developed SQL stored Procedures and scripts.
- Performed Unit testing and integration testing on the module designed.
- Worked on performance tuning the legacy code to resolve the bottlenecks and decrease the run time of the code.
- Performed data quality checks, then documented and resolved the issues found.
- Migrated ETL to higher environments.
- Worked on scheduling the developed ETL code.
- Created IDQ mappings using Informatica Developer to standardize addresses using Address Doctor.
- Validated production data against the business rules and resolved any issues identified.
- Worked on production issues and resolved them.
- Coordinated with other teams on any failures and worked on resolving them.
- Coordinated with other team members on dependencies and worked with them to get them resolved.
Environment: Informatica Power Center 10.2, Teradata 16, SQL Server, Guidewire, Informatica CDI
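The pre-load data quality checks described above (profiling a batch for null and duplicate keys before it moves downstream) can be sketched as follows; the field name and rules are illustrative assumptions, not details from the actual project:

```python
# Minimal data-quality profiling sketch: count null-key and duplicate-key
# rows in a batch before loading, as an IDQ-style pre-load gate.
# Field names and rules are hypothetical examples.

def profile_rows(rows, key_field):
    """Return counts of null-key and duplicate-key rows in a batch."""
    nulls = sum(1 for r in rows if r.get(key_field) in (None, ""))
    seen, dupes = set(), 0
    for r in rows:
        k = r.get(key_field)
        if k in (None, ""):
            continue  # already counted as a null key
        if k in seen:
            dupes += 1
        seen.add(k)
    return {"null_keys": nulls, "duplicate_keys": dupes}

if __name__ == "__main__":
    batch = [
        {"policy_id": "P1", "state": "AL"},
        {"policy_id": "P1", "state": "AL"},   # duplicate key
        {"policy_id": None, "state": "GA"},   # null key
    ]
    print(profile_rows(batch, "policy_id"))
    # → {'null_keys': 1, 'duplicate_keys': 1}
```

In practice such counts would be written to an audit table and compared against thresholds before the load is allowed to proceed.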
Confidential, MO
ETL/Informatica Developer
Responsibilities:
- Worked with Business Analyst and application users to finalize Data Model, functional and detailed technical requirements.
- Responsible for ETL and coding standards.
- Used Informatica as ETL tool, and stored procedures to pull data from source systems/ files, cleanse, transform and load data into databases.
- Used Erwin to create star and snowflake schema model diagrams.
- Created detailed Technical specifications for Data Warehouse and ETL processes.
- Conducted a series of discussions with team members to convert Business rules into Informatica mappings.
- Extracted data from source systems and loaded into Teradata Data Warehouse.
- Used transformations like Lookup, Router, Filter, Joiner, Stored Procedure, Source Qualifier, Aggregator and Update Strategy extensively.
- Tuned performance of Informatica session for large data files by increasing block size, data cache size, sequence buffer length and target based commit interval.
- Created Mapplets and used them in different Mappings.
- Used Teradata utilities fastload, multiload and Stored procedures to load data.
- Wrote Teradata Macros and used various Teradata analytic functions.
- Developed stored procedure to check source data with warehouse data and if not present, write the records to spool table and used spool table as lookup in transformation.
- Ensured MDM code conforms to established coding standards and meets the feature specification.
- Involved in doing error handling, debugging and troubleshooting Sessions using the Session logs, Debugger and Workflow Monitor.
- Documented Cleansing Rules discovered from data cleansing and profiling.
- Managed Scheduling of Tasks to run any time without any operator intervention.
- Leveraged workflow manager for session management, database connection management and scheduling of jobs.
- Created Unix Shell Scripts for Informatica ETL tool to automate sessions and cleansing the source data.
- Experienced in debugging and performance tuning of targets, sources, mappings and sessions.
- Experienced in optimizing mappings and implementing complex business rules by creating reusable transformations and mapplets.
- Involved in migrating objects from DEV to QA and then promoting to Production.
- Involved in production support, working on tickets created by users while retrieving data from the database.
- Performed Gap analysis to analyze the differences between the reports and target tables.
- Used SVN to migrate ETL to higher environment.
- Involved in preparing the mapping document for the QA team to test the ETL and the SQL.
- Validated and tested reports in Microstrategy by comparing it to the test scenarios and target data set.
- Delivered all the projects/assignments within specified timelines.
Environment: Informatica Power Center 10.1, Teradata, Flat files, One Automation
Confidential, Westchester, PA
ETL/Informatica Developer
Responsibilities:
- Worked with Product owners and business users to understand and finalize the reporting requirements.
- Analyzed source systems and performed data profiling; defined data transformations based on the profiling results.
- Prototyped the functionality of the ETL and presented it to the architects for the approval.
- Documented the H.L.D and mapping document for the development. Also documented the testing scenarios and test cases.
- Worked with architects/BI teams to develop the end-to-end ETL process.
- Used Informatica to extract data from various source systems like Oracle, flat files, DB2, SQL Server, mainframe etc. and loaded it into the target Teradata tables.
- Used Teradata Parallel Transporter (TPT) connections for landing the data. Used error tables for routing error rows from the source.
- Worked to setup the SFTP/FTP process to get the flat files from external vendors to our server.
- Used different transformations like Source qualifier, expression, joiner, filter, router etc. to develop mappings. Created sessions and workflows for the mappings.
- Used Pushdown Optimization (PDO) in Informatica to load the data; handled special scenarios and updated mappings to make them PDO compliant.
- Used Stonebranch tool for job scheduling.
- Hands on experience on UNIX scripting - developed shell scripts for archiving the files and deleting them based on the received date and type.
- Used temporal tables to maintain the history.
- Used PowerExchange to create data maps and import data from the mainframe server into Informatica.
- Worked on optimizing the job runtime by creating indexes, table partitions, collecting stats etc.
- Debugged and modified ETL in case of any bottlenecks. Used Teradata Viewpoint and explain plans to understand and optimize queries.
- Used FastLoad to move data between environments.
- Used Jenkins to migrate ETL from DEV to QA.
- Created Change requests to migrate ETL, DDL and schedule changes to PROD.
- Worked on SQL scripting or creating one time jobs to backfill data in PROD.
- Created validation SQL to verify the data in PROD.
- Worked with QA to come up with a test strategy, scenarios, data prep and testing the ETL.
- Worked on unit testing the ETL and documenting it.
- Used SAS for QA testing in some scenarios. Also worked on high level QA testing.
- Hands on experience on HP ALM for defect tracking and reporting.
- Used Informatica metadata manager tool for analyzing any workflow/table etc.
- Involved in Agile related ceremonies like daily sprint meeting, planning, refining, sprint reviews, retrospective etc.
Environment: Informatica Power Center 10.1, IDQ, Teradata 15.1, Oracle, SQL, PowerExchange, DB2, VSAM, Mainframe, AQT, Teradata Studio, Erwin, Shell Scripts, Unix.
Confidential, Dallas, TX
ETL/Informatica Developer
Responsibilities:
- Conducted JAD sessions with business users and SME's for better understanding of the reporting requirements.
- Designed and developed the end-to-end ETL process from various source systems to the staging area, and from staging to the data marts.
- Developed high-level and detailed technical and functional documents, consisting of detailed design documentation, functional test specifications with use cases, and unit test documents.
- Analyzed source systems and worked with business analysts to identify, study and understand requirements and translate them into ETL code.
- Created Jobs & Job plans for the Workflows, SQL jobs & Unix scripts by using UC4 tool.
- Worked with the UC4 scheduler to schedule Informatica Power Center workflows; involved with the scheduling team in creating and scheduling jobs in Workload Scheduler.
- Handled technical and functional calls across teams.
- Responsible for the Extraction, Transformation and Loading (ETL) Architecture & Standards implementation
- Responsible for offshore Code delivery and review process
- Used Informatica to extract data from DB2, HL7, XML and flat files and load it into target databases such as Oracle, Teradata, DB2 and flat files.
- Worked on converting HL7 message to usable format as per the requirement.
- Worked in all phases of Data Integration from heterogeneous sources, legacy systems to Target Database.
- Worked with different EDI files like 837, 834, 835, 270, 271.
- Worked on Informatica Power Center tool - Source Analyzer, Warehouse designer, Mapping and Mapplet Designer, Transformations, Informatica Repository Manager, Informatica Workflow Manager and Workflow Monitor.
- Used Informatica Data Quality (IDQ) to verify the accuracy of address and contact data.
- Used IDQ for data profiling and cleansing functions.
- Involved in Design Review, code review, test review, and gave valuable suggestions.
- Worked with different Caches such as Index cache, Data cache, Lookup cache (Static, Dynamic and Persistence) and Join cache while developing the Mappings.
- Created partitions for parallel processing of data and also worked with DBAs to enhance the data load during production.
- Performance-tuned Informatica sessions for large data files by increasing block size, data cache size, and the target-based commit interval.
- Worked on CDC (Change Data Capture) to implement SCD (Slowly Changing Dimensions) Type 1 and Type 2.
- Worked on migrating mainframe code to Informatica.
- Used PowerExchange to create data maps and move data into Informatica.
- Worked on changes related to existing Electronic Health Records
- Involved in writing a procedure to check the up-to-date statistics on tables.
- Used Informatica command task to transfer the files to bridge server to send the file to third party vendor.
- Created Unix Shell Scripts for Informatica ETL tool to automate sessions.
- Took part in migration of jobs from UIT to SIT and to UAT
- Created FTP scripts and Conversion scripts to convert data into flat files to be used for Informatica sessions
- Involved in Informatica Code Migration across various Environments.
Environment: Informatica Power Center 10, IDQ, Facets, Oracle, PowerExchange, DB2, VSAM, Mainframe, TOAD, Erwin, Shell Scripts, HL7, HIE, UC4, Unix.
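The CDC-driven SCD Type 2 loading described above (expire the current dimension row when an attribute changes, then insert a new current version) can be sketched with a tiny in-memory table. SQLite stands in for the actual target database, and the schema and change-detection rule are illustrative assumptions:

```python
import sqlite3

# Sketch of SCD Type 2: when an incoming record differs from the current
# dimension row, expire the current row and insert a new current version.
# Schema and the change-detection rule are hypothetical examples.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
    CREATE TABLE dim_member (
        member_id TEXT, plan TEXT,
        eff_date TEXT, end_date TEXT, current_flag INTEGER
    );
    INSERT INTO dim_member VALUES ('M1','GOLD','2020-01-01','9999-12-31',1);
""")

def apply_scd2(member_id, plan, load_date):
    row = cur.execute(
        "SELECT plan FROM dim_member WHERE member_id=? AND current_flag=1",
        (member_id,)).fetchone()
    if row and row[0] == plan:
        return  # no change -> nothing to do (Update Strategy would DD_REJECT)
    if row:    # expire the current version
        cur.execute(
            "UPDATE dim_member SET end_date=?, current_flag=0 "
            "WHERE member_id=? AND current_flag=1", (load_date, member_id))
    cur.execute("INSERT INTO dim_member VALUES (?,?,?, '9999-12-31', 1)",
                (member_id, plan, load_date))

apply_scd2('M1', 'SILVER', '2021-06-01')
rows = cur.execute(
    "SELECT plan, current_flag FROM dim_member WHERE member_id='M1' "
    "ORDER BY eff_date").fetchall()
print(rows)  # → [('GOLD', 0), ('SILVER', 1)]
```

In the Informatica mapping, the same branch logic is carried by a Lookup on the current dimension row plus an Update Strategy transformation flagging rows for update or insert.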
Confidential, Naperville, IL
Sr. ETL/Informatica Developer
Responsibilities:
- Worked with Business Analyst and clients to finalize functional and detailed technical requirements
- Prepared the required application design documents based on functionality required.
- Designed the ETL processes using Informatica to load data from Oracle and Flat Files to staging database and from staging to the target database.
- Implemented the best practices for the creation of mappings, sessions and workflows and performance optimization.
- Involved in the data migration team to build reusable DataStage job templates, common parameter sets, common DataStage job containers, SQL extract procedures and common reusable shell scripts.
- Involved in creation of star schema and snowflake schema model diagrams using MS Visio.
- Worked with various sources like Oracle, mainframe, VSAM, flat files, SFDC, DB2, Cassandra etc.
- Integrated data from NoSQL databases into Informatica.
- Extensively used Informatica and created mappings using transformations like Source Qualifier, Joiner, Aggregator, Expression, Filter, Router, Lookup, Update Strategy, and Sequence Generator.
- Designed and developed the logic for handling slowly changing dimension table loads by flagging records using the update strategy to populate the desired records.
- Involved in cleansing and extraction of data and defined quality process for the warehouse.
- Involved in optimization of Informatica mappings and sessions using features like partitions, index cache to manage very large volume of data.
- Created Stored Procedures to transform the Data and worked extensively in PL/SQL for various needs of the transformations while loading the data.
- Translated Business specifications into PL/SQL code. Extensively developed and fine-tuned Oracle Stored Procedures and triggers.
- Developed complex SQL code for SQL transformations and SQL over-rides.
- Created dictionaries using Informatica Data Quality (IDQ) that were used to cleanse and standardize data. Developed IDQ plans to identify possible data issues.
- Worked with Informatica Data Quality (IDQ) Toolkit for analyzing, standardizing, cleansing, matching, conversion, exception handling, reporting and monitoring the data.
- Used Informatica Data Quality (IDQ) profiler and developer tool to analyze source system data and discover underlying issues.
- Converted data to unreadable format using data masking techniques.
- Configured and installed the Informatica MDM Hub Server, Cleanse Server and Address Doctor.
- Involved in loading data from different sources into Informatica MDM.
- Used PowerExchange to create data maps and CDC maps and import mainframe VSAM sources into Power Center.
- Worked on data Extraction, Transformation and loading from source to target system using BTEQ, FastLoad, and MultiLoad.
- Teradata performance tuning via EXPLAIN plans, PPI, AJIs, indexes, collecting statistics, or rewriting of the code.
- Worked on creation of views, materialized views using PL/SQL for the data warehouse.
- Logged defects and submitted change requests using the Task module of Change Director and HEAT (help ticket system).
- Worked with different Informatica tuning issues to find bottlenecks in the mappings and fine-tuned them to make them more efficient in terms of performance.
- Performed data validation, reconciliation and error handling in the load process, and tested data to ensure it was masked.
- Worked on unit testing the code and designed Unit test documents.
- Designed and developed test cases for integration testing.
- Involved in job scheduling using Autosys.
- Involved in Unit testing, User Acceptance testing to check whether the data is loading into target, which was extracted from different source systems according to the user requirements.
- Involved in migrating objects from DEV to Testing and FVT and then promoting them to Production.
- Involved in production support, working on tickets created by users while retrieving data from the database.
- Worked with the end users to gather the business requirements for designing the reports
- Worked on Cognos Report studio to generate Complex Reports based on the requirements.
- Designed Prompt pages for the reports.
- Designed drilldown, crosstab, list reports.
Environment: Informatica Power Center 9.5, Business Objects, Teradata, Oracle 11g/12c, TOAD, Erwin, SQL, PL/SQL, XML, HP UNIX, Test Director/Quality Center, MuleSoft, Autosys, SFDC, CC360, Python, Cognos Report Studio, MDM, Oracle CCB, DataStage.
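The data masking mentioned above (rendering sensitive fields unreadable while keeping them usable for joins and testing) can be sketched with deterministic hashing. The field names and salt are hypothetical placeholders, not the project's actual masking rules:

```python
import hashlib

# Sketch of a data-masking step: sensitive fields are replaced with a
# deterministic, irreversible token so the same input always masks to the
# same value and referential joins still work. The salt would come from a
# secured configuration in practice; this one is a placeholder.
SALT = "not-a-real-salt"

def mask(value: str) -> str:
    """Return a short, irreversible token for a sensitive value."""
    return hashlib.sha256((SALT + value).encode()).hexdigest()[:12]

def mask_record(rec: dict, sensitive=("ssn", "email")) -> dict:
    """Mask only the configured sensitive fields of a record."""
    return {k: (mask(v) if k in sensitive else v) for k, v in rec.items()}

record = {"cust_id": "C42", "ssn": "123-45-6789", "email": "a@b.com"}
masked = mask_record(record)
print(masked["cust_id"], masked["ssn"] != record["ssn"])
```

Determinism is the key design choice here: two tables masked independently still join on the masked key, which is what makes masked data usable for integration testing.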
Confidential, Dallas, TX
Sr. ETL/Informatica Developer
Responsibilities:
- Worked with Business Analyst and application users to finalize Data Model, functional and detailed technical requirements.
- Responsible for Data Warehouse Architecture, ETL and coding standards.
- Developed Capacity Planning/Architecture/ Strategic Roadmaps/Implementing standards.
- Used Informatica as ETL tool, and stored procedures to pull data from source systems/ files, cleanse, transform and load data into databases.
- Used Erwin to create star and snowflake schema model diagrams.
- Created detailed Technical specifications for Data Warehouse and ETL processes.
- Conducted a series of discussions with team members to convert Business rules into Informatica mappings.
- Created dictionaries using Informatica Data Quality (IDQ) that was used to cleanse and standardized Data. Worked with Informatica and other consultants to develop IDQ plans to identify possible data issues.
- Extracted data from SAP R/3 and loaded into Oracle Data Warehouse.
- Used Transformations like Look up, Router, Filter, Joiner, Stored Procedure, Source Qualifier, Aggregator and Update Strategy extensively.
- Tuned performance of Informatica session for large data files by increasing block size, data cache size, sequence buffer length and target based commit interval.
- Created Mapplets and used them in different Mappings.
- Used data masking techniques to obfuscate sensitive data.
- Migrated legacy mainframe code to Informatica.
- Created CDC maps to import VSAM files into Informatica.
- Normalized the VSAM data and handled the logic in Informatica.
- Used Teradata utilities FastLoad, MultiLoad and TPump to load data.
- Wrote Teradata macros and used various Teradata analytic functions.
- Developed stored procedure to check source data with warehouse data and if not present, write the records to spool table and used spool table as lookup in transformation.
- Performed extensive bulk loading into the target using Oracle SQL*Loader.
- Designed and developed Mappings for loading MDM HUB
- Ensured MDM code conforms to established coding standards and meets the feature specification.
- Performed application tuning and disk I/O tuning to enhance system performance.
- Involved in doing error handling, debugging and troubleshooting Sessions using the Session logs, Debugger and Workflow Monitor.
- Documented Cleansing Rules discovered from data cleansing and profiling.
- Managed Scheduling of Tasks to run any time without any operator intervention.
- Leveraged workflow manager for session management, database connection management and scheduling of jobs.
- Created Unix Shell Scripts for Informatica ETL tool to automate sessions and cleansing the source data.
- Experienced in debugging and performance tuning of targets, sources, mappings and sessions.
- Experienced in optimizing mappings and implementing complex business rules by creating reusable transformations and mapplets.
- Delivered all the projects/assignments within specified timelines.
Environment: Informatica Power Center 9.5, PowerExchange, SAP R/3, Teradata, Flat files, MS SQL Server 2008, DB2 8.0, Erwin, WinSCP, Control-M, MDM, Sqoop, MongoDB, SFDC, Mercury Quality Center, Shell Script, UNIX.
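The shell automation around Informatica sessions mentioned above is typically a wrapper around the `pmcmd` CLI. A minimal sketch of building (and optionally running) a `startworkflow` command follows; the service, domain, folder and workflow names are placeholders, and the password is read from an environment variable via `-pv` rather than hard-coded:

```python
import subprocess  # used only by the commented-out run() call below

# Sketch of a pmcmd wrapper: assemble a startworkflow command line.
# IS_DEV, Domain_DEV, SALES_DM and wf_load_sales are hypothetical names.
def build_pmcmd_start(service, domain, user, folder, workflow):
    """Return the argv list for 'pmcmd startworkflow' with -wait."""
    return ["pmcmd", "startworkflow",
            "-sv", service, "-d", domain,
            "-u", user, "-pv", "PMPASS",   # password taken from $PMPASS
            "-f", folder, "-wait", workflow]

cmd = build_pmcmd_start("IS_DEV", "Domain_DEV", "etl_user",
                        "SALES_DM", "wf_load_sales")
print(" ".join(cmd))
# subprocess.run(cmd, check=True)  # would actually start the workflow
```

Wrapping the command in a function keeps credentials and environment names out of the scheduler definition, so the same script serves DEV, QA and PROD with different parameters.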
Confidential, Phoenix, AZ
Sr. ETL Developer
Responsibilities:
- Prepared the required application design documents based on functionality required
- Designed the ETL processes using Informatica to load data from Oracle, DB2 and Flat Files to staging database and from staging to the target database.
- Implemented the best practices for the creation of mappings, sessions and workflows and performance optimization.
- Involved in migration of mappings and sessions from development repository to production repository
- Extensively used Informatica and created mappings using transformations like Source Qualifier, Joiner, Aggregator, Expression, Filter, Router, Lookup, Update Strategy, and Sequence Generator.
- Designed and developed the logic for handling slowly changing dimension table loads by flagging records using the update strategy to populate the desired records.
- Involved in cleansing and extraction of data and defined quality process for the warehouse.
- Involved in performance tuning and optimization of Informatica mappings and sessions using features like partitions, index cache to manage very large volume of data.
- Created Stored Procedures to transform the Data and worked extensively in PL/SQL for various needs of the transformations while loading the data.
- Translated Business specifications into PL/SQL code. Extensively developed and fine-tuned Oracle Stored Procedures and triggers.
- Used Update Strategies for cleansing, updating and adding data to the existing processes in the warehouse.
- Logged defects and submitted change requests using the Defects module of Test Director/HP Quality Center.
- Worked with different Informatica tuning issues and fine-tuned the transformations to make them more efficient in terms of performance.
- Involved in Unit testing, User Acceptance testing to check whether the data is loading into target, which was extracted from different source systems according to the user requirements.
- Involved in migrating objects from DEV to QA and testing them and then promoting to Production
- Involved in production support, working on tickets created by users while retrieving data from the database.
Environment: Informatica Power Center, Business Objects, Oracle 10g, TOAD, Erwin, SQL, PL/SQL, XML, HP UNIX, Test Director/Quality Center
Confidential, Dallas, TX
Informatica Developer
Responsibilities:
- Imported various Sources, Targets, and Transformations using Informatica Power Center Server Manager, Repository Manager and Designer.
- Created and managed the global and local repositories and permissions using Repository Manager in Oracle Database.
- Responsibilities included source system analysis, data transformation, loading, validation for data marts, operational data store and data warehouse.
- Used heterogeneous files from Oracle, Flat files and SQL server as source and imported stored procedures from oracle for transformations.
- Designed and coded maps, which extracted data from existing, source systems into the data warehouse.
- Used Dimensional Modeling Techniques to create Dimensions, Cubes and Fact tables.
- Wrote PL/SQL procedures for processing business logic in the database and tuned SQL queries for better performance.
- Scheduled Sessions and Batch Process based on demand, run on time, run only once using Informatica Server Manager.
- Generated completion messages and status reports using Informatica Server manager.
- Tuned ETL procedures and STAR schemas to optimize load and query Performance.
- Started sessions and batches and configured event-based scheduling.
- Managed migration in a multi-vendor supported Server and Database environments.
Environment: Informatica Power Center 7.1.2, DB2 v8.0, SQL, Windows 2000, UNIX, SQL Server 2000, Oracle 8i, Flat files, SQL *Plus, Business Objects 5.1.6