Lead ETL Informatica/IDQ Developer Resume
Florida
PROFESSIONAL SUMMARY:
- 8+ years of IT experience with a strong background in ETL development using Informatica tools: PowerCenter, Data Quality (IDQ), and Informatica Cloud.
- Expert knowledge of Informatica Power Center 9.x/8.x/7.x (Designer, Repository Manager, Repository Server Administration Console, Server Manager, Workflow Manager, Workflow Monitor), including installation and setup of Informatica 7.x/8.x/9.x.
- Hands on experience with Informatica Data Explorer (IDE) / Informatica Data Quality (IDQ) tools for Data Analysis / Data Profiling and Data Governance.
- Helped with data profiling, specifying and validating rules (Scorecards), and monitoring data quality using the Informatica Analyst tool.
- Worked on Power Exchange for change data capture (CDC).
- Extensive experience developing ETL processes supporting data extraction, transformation, and loading using Informatica Power Center.
- Experience in Data Warehouse/Data Mart, ODS, OLTP, and OLAP implementations, covering project scoping, analysis, requirements gathering, data modeling, ETL design, development, system testing, implementation, and production support.
- Solid expertise in Informatica ETL, data profiling, and IDQ data quality.
- Worked with the Informatica Data Quality toolkit: analysis, data cleansing, data matching, data conversion, exception handling, and the reporting and monitoring capabilities of IDQ 9.6.1.
- Good knowledge of data warehouse concepts and principles (Kimball/Inmon): Star Schema, Snowflake, Enterprise Data Vault, SCDs, surrogate keys, and normalization/denormalization.
- Experience integrating various data sources with relational databases like Oracle and SQL Server, and integrating data from flat files.
- Experience in profiling data using Informatica Data Quality 9.5.1(Analyst tool).
- Extensive experience in using various Informatica Designer Tools like Source Analyzer, Mapping Designer, Mapplet Designer, Schedulers and Warehouse Designer.
- Expertise in Data Warehousing/ETL programming and Fulfillment of data warehouse project tasks such as data extraction, cleansing, transforming and loading.
- Extensively worked on Informatica Power Center Transformations such as Source Qualifier, Lookup, Filter, Expression, Router, Joiner, Update Strategy, Rank, Aggregator, Stored Procedure, Sorter, Sequence Generator, Normalizer, Union, and XML Source Qualifier.
- Highly experienced in developing, designing, reviewing and documenting Informatica work products like Mappings, Mapplets, Reusable transformations, Sessions, Workflows, Worklets, Schedulers and experienced in using Mapping parameters, Mapping variables, Session parameter files.
- Proficient in ETL (Extract - Transform - Load) using SQL Server Integration Services 2005 (SSIS) and Informatica Power Center tool.
- Extensive experience in error handling and problem fixing in Informatica.
- Designed complex mappings, with expertise in performance tuning.
- Experience in troubleshooting by tuning mappings and identifying and resolving performance bottlenecks at various levels: source, target, mapping, and session.
- Involved in Implementing Slowly Changing Dimensions, Star Join Schema modeling, Snowflake modeling, FACT tables, Dimension tables, denormalization.
- Experience in SQL Server, Teradata, and DB2.
- Experience in UNIX (AIX/Solaris/ HP-UX 11.x), and Windows Operating system.
- Extensively used SQL and PL/SQL to write Stored Procedures, Functions, and Packages.
- Excellent overall software development life cycle (SDLC) experience; conceptual and logical thinker; goal-oriented, self-motivated, and able to work independently and as a member of a team.
- Excellent communication, analytical and interpersonal skills.
- Excellent experience using Teradata SQL Assistant and data load/export utilities like BTEQ, FastLoad, and MultiLoad.
- Quick learner and adaptive to new and challenging technological environments
TECHNOLOGIES:
ETL/IDQ Tools: Informatica Power Center/Data Quality (IDQ)/Power Exchange 7.1-9.6.1 HF1, SSIS, SSRS
Databases: Oracle 8i-11g, SQL Server 2005/2008 R2, DB2, Teradata, MySQL.
Reporting Tools: OBIEE, MicroStrategy.
GUI Tools: SQL Developer, TOAD 9.5, SQL*Plus, SQL*Loader, XML Publisher, IIR, Web Services (WSDL), SOAP, IDE, JIRA, SAP, PuTTY, WinSCP, COBOL, BTEQ.
Languages: SQL, PL/SQL, T-SQL, XML, Unix Shell Scripting.
Operating Systems: Windows 95/98/ME/NT/XP/Vista/7/8, UNIX, Mac.
PROFESSIONAL EXPERIENCE:
Confidential, Florida
Lead ETL Informatica/IDQ Developer
Responsibilities:
- Worked with the Informatica Data Quality (IDQ) 9.6.1 toolkit, covering analysis, data cleansing, data matching, data conversion, exception handling, and its reporting and monitoring capabilities.
- Performed system analysis and requirements analysis; designed and wrote technical documents and test plans.
- Created a hybrid process in IDQ by combining the IDQ Developer and Analyst tools through Logical Data Objects (LDOs).
- Worked on IDQ Analyst for Profiling, Creating rules on Profiling and Scorecards.
- Worked with Management for creating the requirements and estimates on the Project.
- Assisted Business Analyst with drafting the requirements, implementing design and development of various components of ETL for various applications.
- Coordinated with DBA in creating and managing tables, indexes, table spaces and data quality checks.
- Used Teradata external loading utilities like MultiLoad, TPump, FastLoad, and FastExport to extract from and load data efficiently into the Teradata database.
- Used Informatica PowerCenter 9.1 for Data Extraction, loading and transformation (ETL) of data into the target systems.
- Designed IDQ mappings that are used as mapplets in Power Center.
- Developed numerous mappings using the various transformations including Address Validator, Association, Case Converter, Classifier, Comparison, Consolidation, Match, Merge, Parser etc.
- Used session parameters, mapping variables/parameters, and parameter files to enable flexible workflow runs based on changing variable values.
- Created Complex ETL Mappings to load data using transformations like Source Qualifier, Sorter, Aggregator, Expression, Joiner, Dynamic Lookup, and Connected and unconnected lookups, Filters, Sequence, Router and Update Strategy.
- Created Informatica mappings to populate dimension and fact tables from ODS tables.
- Extensive work in SSRS, SSIS, MS SQL Server, SQL Programming and MS Access.
- Used the Power Exchange Change Data Capture (CDC) option to capture inserts, updates, and deletes as soon as they occur and transfer them to multiple targets without intermediate queues or staging tables.
- Design Conceptual, Logical, Physical Data models and implementation of Operational Data Store/ Data Warehouse/ Data Mart.
- Performed standard IDQ functions including data profiling, data cleansing, and address validation.
- Identified the bottlenecks in the sources, targets, mappings, sessions and resolved the problems.
- Implemented Slowly Changing Dimensions both SCD Type 1 & SCD Type 2.
- Developed Test Scripts, Test Cases, and SQL QA Scripts to perform Unit Testing, System Testing.
- Responsible for authoring technical documentation and performing reviews, design, data manipulation, and creative problem solving.
- Involved in analyzing different modules of FACETS system and EDI interfaces to understand the source system and source data.
- Communicated with business users to understand their problems, provided workarounds, identified root causes, and advised on any required enhancements.
- Responsible for tuning of ETL processes.
- Involved in writing Teradata SQL bulk programs and in Performance tuning activities for Teradata SQL statements using Teradata EXPLAIN
- Design and develop mappings, sessions and workflow as per the requirement and standards.
- Responsible for making changes in the existing configuration wherever required, making customizations according to the business requirements, testing and successfully moving the changes into production.
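The parameter files mentioned above are plain text keyed by folder, workflow, and session. As a rough sketch, a script like the following could generate one per run date; the folder, workflow, session, and parameter names here are hypothetical, not taken from the actual project.

```shell
#!/bin/sh
# Sketch: generate an Informatica parameter file for a daily load.
# Folder, workflow, session, and parameter names are hypothetical.
RUN_DATE=${1:-$(date +%Y-%m-%d)}
PARAM_FILE=${PARAM_FILE:-/tmp/wkf_daily_load.param}

# The [Folder.WF:workflow.ST:session] header scopes the values below it.
cat > "$PARAM_FILE" <<EOF
[FolderName.WF:wkf_daily_load.ST:s_m_load_customer]
\$\$RUN_DATE=$RUN_DATE
\$\$SRC_SYSTEM=CRM
\$DBConnection_TGT=EDW_ORACLE
EOF

echo "Wrote $PARAM_FILE"
```

The workflow would then be scheduled with this file passed as its parameter file, so the same mapping runs against different dates and connections without code changes.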
Environment: Informatica 9.x, Informatica Data Quality (IDQ) 9.6.1, Teradata, Oracle 11g, JDE, Power Exchange, Microsoft Visual Studio, PL/SQL, SSIS/SSRS, XML.
Confidential, West Chester, PA
Sr. ETL Informatica/IDQ Developer
Responsibilities:
- Analyzed the business requirements and framed the business logic for the ETL process.
- Designed and developed complex mappings and reusable transformations for ETL using Informatica Power Center 9.6.
- Designed ETL processes using Informatica to load data from SQL Server, flat files, XML files, and Excel files into a target Oracle database.
- Created complex SCD type 1 & type 2 mappings using dynamic lookup, Joiner, Router, Union, Expression and Update Transformations.
- Used Teradata external loading utilities like MultiLoad, TPump, FastLoad, and FastExport to extract from and load data efficiently into the Teradata database.
- Developed complex ETL mappings for staging, operational data store (ODS), data warehouse, and data mart loads.
- Performed data manipulations using various Informatica Transformations like Aggregate, Filter, Update Strategy, and Sequence Generator etc.
- Responsibilities included designing and developing complex mappings using Informatica power center and Informatica developer (IDQ) and extensively worked on Address Validator transformation in Informatica developer (IDQ)
- Designed and Developed ETL strategy to populate the Data Warehouse from various source systems such as Oracle, Teradata, Netezza, Flat files, XML, SQL Server
- Worked in a Service Oriented Architecture (SOA) and familiar with transaction-oriented data warehouses where data is updated and fetched in real time. Worked on change data capture (CDC) while loading dimensions by implementing SCD Type 1 and Type 2 from complex data extractions.
- Used Address Validator transformation in IDQ and passed the partial address and populated the full address in the target table and created mappings in Informatica Developer (IDQ) using Parser, Standardizer and Address Validator Transformations.
- Extensively used the change data capture concept in Informatica as well as in the Oracle database to capture changes to the data mart. Change data capture reduced the time taken to load the data mart by loading only the changed data.
- Used Teradata EXPLAIN and PMON to analyze and improve query performance.
- Worked on dimension as well as fact tables, developed mappings, loaded data into the relational database, and created Informatica parameter files and user-defined functions for handling special characters.
- Processed claims through EDI 837 files into the FACETS system and worked on scenarios covering the complete claims lifecycle.
- Designed Informatica mappings to publish daily and monthly events files via ESB for consumption by downstream systems and for transformation into the legacy ECF file format.
- Worked on different file formats like sequence files, XML files, and map files using MapReduce programs.
- Involved in writing Teradata SQL bulk programs and in Performance tuning activities for TeradataSQL statements using Teradata EXPLAIN
- Worked on Informatica PowerCenter Designer - Source Analyzer, Warehouse Designer, Mapping Designer & Mapplet Designer, and Transformation Developer.
- Wrote SQL overrides in Source Qualifier transformations according to business requirements, and created Oracle tables, views, materialized views, and PL/SQL stored procedures and functions.
- Analyzed WSDL interactions with Oracle on Demand to push as well as pull data via Informatica.
- Generated UNIX shell scripts for automating daily load processes and scheduled and unscheduled workflows and used UNIX command tasks to automate the entire process of fetching the source file from a different path and FTP it onto the server.
- Estimated and planned development work using Agile software development.
- Involved in writing ETL specifications and unit test plans for the mappings and performed Developer testing, Functional testing, Unit testing for the Informatica mappings.
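The UNIX automation described above (fetching a source file from a different path and staging it for the workflow) can be sketched roughly as below. All directory and file names are invented for illustration, and the real fetch step was an FTP transfer rather than the local copy simulated here.

```shell
#!/bin/sh
# Sketch: stage a daily source file for an Informatica workflow and archive
# the previous run's file. Paths are hypothetical; in the real process the
# arrival step was an FTP command, simulated here by creating a local file.
set -eu
BASE=${BASE:-/tmp/etl_demo}
LANDING="$BASE/landing"; SRC="$BASE/srcfiles"; ARCHIVE="$BASE/archive"
mkdir -p "$LANDING" "$SRC" "$ARCHIVE"

# Simulate the arrival of today's extract (FTP in the real process).
FILE="customers_$(date +%Y%m%d).csv"
: > "$LANDING/$FILE"

# Archive anything left over from the previous run, then stage the new file.
for old in "$SRC"/*.csv; do
  [ -e "$old" ] && mv "$old" "$ARCHIVE/"
done
mv "$LANDING/$FILE" "$SRC/$FILE"
echo "Staged $SRC/$FILE"
```

A scheduler (Tivoli/Tidal in the environments below) would run this ahead of the workflow so the session always finds exactly one current file in the source directory.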
Environment: Informatica Power Center 9.6, DB2, Informatica Cloud, Erwin 9.x, UNIX Shell Scripting, Oracle 12c, PL/SQL, Business Objects XI R2, SQL Server 2012, Korn Shell Scripting, Teradata, SQL, T-SQL, Microsoft Visual Studio 2014, Teradata SQL Assistant, Tivoli Workload Scheduler 8.4, Tidal Job Scheduler, SSRS, SSIS, TOAD 9.7.2, Crystal Reports 11
Confidential, Atlanta, GA
Sr. ETL Informatica / IDQ Developer
Responsibilities:
- Involved in requirement analysis, ETL design and development for extracting data from the heterogeneous source systems like MS SQL Server, Oracle, flat files, XML files and loading into Staging and Enterprise Data Vault.
- Responsible for converting Functional Requirements into Technical Specifications.
- Identified facts and dimensions from the source system and business requirements to be used for the data warehouse.
- Designed, developed, and unit-tested SQL views using Teradata SQL to load data from source to target.
- Estimated and planned development work using Agile software development.
- Analyzed ICD-9 and ICD-10 for data mapping from ICD-9 to ICD-10 and ICD-10 to ICD-9 at the source and target levels.
- Identified and eliminated duplicates in datasets through IDQ 8.6.1 components such as Edit Distance, Jaro Distance, and Mixed Field Matcher, enabling a single view of customers and helping control mailing-list costs by preventing duplicate mailings.
- Involved in massive data profiling using IDQ (Analyst tool) prior to data staging.
- Maintained warehouse metadata, naming standards and warehouse standards for future application development.
- Interpreted logical and physical data models for Business users to determine common data definitions and establish referential integrity of the system.
- Involved in analyzing different modules of FACETS system and EDI interfaces to understand the source system and source data.
- Used Informatica PowerCenter 9.6.1 for Data Extraction, Transformation and Loading data from heterogeneous source systems into the target data base.
- Imported Data from various sources FLAT FILE, SQL SERVER and ORACLE.
- Worked on Enterprise Data Warehouse methodologies (landing, staging, publishing, and promoting code to test regions).
- Extracted and loaded data into Teradata using FastLoad and MultiLoad.
- Used the SQL Server SSIS tool to build high-performance data integration solutions, including extraction, transformation, and load packages for data warehousing. Extracted data from XML files and loaded it into the database.
- Reviewed mapping documents provided by Business Team, implemented business logic embedded in mapping documents into Teradata SQLs and loading tables needed for Data Validation.
- Used Hierarchies tool for configuring entity base objects, entity types, relationship base objects, relationship types, profiles, put and display packages and used the entity types as subject areas in IDD.
- Worked extensively on Hub, Satellite, Link, LSAT, HLINK, and RSAT ETL mappings.
- Involved in Unit testing, User Acceptance Testing to check whether the data is loading into target, which was extracted from different source systems according to the user requirements.
- Involved in Performance Tuning in Informatica for source, transformation, targets, mapping and session.
- Implemented Slowly Changing Dimensions Type-1, Type-2 approach for loading the target tables.
- Designed various tasks using Informatica workflow manager like session, command, email, event raise, event wait etc.
- Good knowledge on the Enterprise Data Vault architecture.
- Created various data marts from data warehouse and generated reports using Cognos.
- Imported metadata from different sources such as Relational Databases, XML Sources.
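The Teradata FastLoad loads mentioned above are driven by a control script. A minimal, hypothetical sketch of generating such a script follows; the table, column, and file names are invented, and the script is only written to disk here, since executing it would require the fastload utility and a live Teradata system.

```shell
#!/bin/sh
# Sketch: generate a Teradata FastLoad control script for a staging load.
# Table, column, and file names are hypothetical; the script is generated
# only, not executed against Teradata.
OUT=${OUT:-/tmp/stg_customer.fastload}

cat > "$OUT" <<'EOF'
LOGON tdpid/etl_user,etl_password;
BEGIN LOADING stg.customer
  ERRORFILES stg.customer_e1, stg.customer_e2;
SET RECORD VARTEXT ",";
DEFINE cust_id (VARCHAR(10)), cust_name (VARCHAR(50))
  FILE = /data/in/customers.csv;
INSERT INTO stg.customer (cust_id, cust_name)
  VALUES (:cust_id, :cust_name);
END LOADING;
LOGOFF;
EOF

echo "Wrote $OUT"
```

FastLoad targets empty tables and uses the two error tables named in ERRORFILES to capture rejected and duplicate rows, which is why it suits initial staging loads rather than incremental ones.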
Environment: Informatica 9.5/9.5.1, Teradata, Oracle 11g, SQL Server, Cognos 8.1/8.2/8.4/10.1.0, Informatica IDQ Analyst tool, SSIS/SSRS, IDE, Metadata Manager.
Confidential, Richardson, TX
Sr. ETL Informatica/IDQ Developer
Responsibilities:
- Developed new business applications and enhanced existing business applications using Informatica Power Center, Oracle SQL, and shell scripting.
- Wrote detailed descriptions of user needs, program functions, and the steps required to develop and implement applications in live environments.
- Prepared technical designs, test case documentation, data load strategy, and operations strategy for all data feeds.
- Troubleshot production problems.
- Worked with business analysts to identify, study, and understand requirements, and translated them into ETL code during the Requirement Analysis phase.
- Experience in creating High Level Design and Detailed Design in the Design phase.
- Expertise in Business Model development with Dimensions, Hierarchies, Measures, Partitioning, Aggregation Rules, Time Series, Cache Management.
- Understanding the business requirements, developing design specifications for enterprise applications using Teradata.
- Implemented Performance Tuning of ETL maps at mapping, session, source and target level as well as writing complex SQL Queries from Abstract data models.
- Used the Power Exchange Change Data Capture (CDC) option to capture inserts, updates, and deletes as soon as they occur and transfer them to multiple targets without intermediate queues or staging tables.
- Extensively worked on the ETL mappings, analysis and documentation of OLAP reports requirements. Solid understanding of OLAP concepts and challenges, especially with large data sets.
- Strong knowledge of Entity-Relationship concept, Facts and dimensions tables, slowly changing dimensions and Dimensional Modeling (Star Schema and Snow Flake Schema).
- Experience in integration of various data sources like Oracle, DB2, SQL server and MS access and non-relational sources like flat files into staging area.
- Populated or refreshed Teradata tables using the FastLoad, MultiLoad, and FastExport utilities for user acceptance testing.
- Experience in creating Reusable Transformations (Joiner, Sorter, Aggregator, Expression, Lookup, Router, Filter, Update Strategy, Sequence Generator, Normalizer and Rank) and Mappings using Informatica Designer and processing tasks using Workflow Manager to move data from multiple sources into targets.
- Data Analyzing and Data Profiling of the source data.
- Experience in creating Reusable Tasks (Sessions, Command, Email) and Non-Reusable Tasks (Decision, Event Wait, Event Raise, Timer, Assignment, Worklet, Control).
- Involved in creating, monitoring, modifying, & communicating the project plan with other team members.
- Designed, developed, and unit-tested SQL views using Teradata SQL to load data from source to target.
- Hands-on experience with Informatica Data Explorer (IDE) / Informatica Data Quality (IDQ) tools for data analysis, data profiling, and data governance.
- Worked with the Informatica Data Quality toolkit: analysis, data cleansing, data matching, data conversion, exception handling, and the reporting and monitoring capabilities of IDQ 9.6.1.
- Utilized Informatica IDQ 9.6.1 to complete initial data profiling and matching/removing duplicate data.
- Tuned mappings using Power Center-Designer and used different logic to provide maximum efficiency and performance.
- Experienced in UNIX work environment, file transfers, job scheduling and error handling.
- Extensively worked on developing and debugging Informatica mappings, mapplets, sessions and workflows.
- Used Informatica PowerCenter 9.1 for Data Extraction, loading and transformation (ETL) of data into the target systems.
- Responsible for verifying accuracy of data, testing methods, maintenance and support of the data warehouse.
- Worked on Performance Tuning, identifying and resolving performance Bottlenecks in various levels like sources, targets, mappings and sessions.
- Involved in Unit testing, System testing to check whether the data loads into targets are accurate.
- Experience in support and knowledge transfer to the production team.
- Proficient in interaction with the business users by conducting meetings with the clients in Requirements Analysis phase.
- Highly motivated and goal-oriented individual with a strong background in SDLC Project Management and Resource Planning using AGILE methodologies.
- Extensive functional and technical exposure. Experience working on high-visibility projects.
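Workflow runs of the kind described above are commonly launched from UNIX with pmcmd and checked by exit status. The sketch below shows that general pattern with a no-op stub in place of the real binary so it can run anywhere; the service, domain, folder, and workflow names are hypothetical, and the pmcmd flags should be verified against the installed PowerCenter version.

```shell
#!/bin/sh
# Sketch: launch a PowerCenter workflow via pmcmd and check the exit status.
# Service, domain, folder, and workflow names are hypothetical; PMCMD
# defaults to the ':' no-op so the sketch runs without a PowerCenter server.
PMCMD=${PMCMD:-:}                  # point at the real pmcmd binary in practice
LOG=${LOG:-/tmp/wkf_daily_load.log}

if $PMCMD startworkflow -sv Int_Svc_Dev -d Domain_Dev \
     -u "${INFA_USER:-etl_user}" -p "${INFA_PASS:-changeme}" \
     -f FOLDER_SALES -wait wkf_daily_load
then
  echo "wkf_daily_load finished OK" | tee -a "$LOG"
else
  rc=$?
  echo "wkf_daily_load FAILED (pmcmd rc=$rc)" | tee -a "$LOG" >&2
  exit "$rc"
fi
```

The -wait flag makes pmcmd block until the workflow completes, so a nonzero exit code from the wrapper lets the scheduler raise an alert immediately.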
Environment: Informatica 9.x, Informatica Data Quality (IDQ) 9.0.1/9.5.1/9.6.1, Oracle 11g, Teradata, Power Exchange, JDE, SQL Server, Tidal, PL/SQL, SQL*Plus, SQL*Loader, XML, Microsoft Visual Studio, Windows XP Professional, FTP, MS Excel, MS Access.
Confidential, Greensboro, NC
Sr. ETL Informatica IDQ Developer
Responsibilities:
- Interacted with Business analysts to collect the Business requirements and understand the Usage of the project.
- Worked extensively with Informatica Big Data Edition.
- Worked on Hadoop files and tables as target definitions.
- Resolved user queries within the SLA with accurate solutions.
- Coordinated with various source teams regarding any data issues in production.
- Developed mappings, workflows, and the ETL scheduling process.
- Implemented CDC using a set-max-variable approach.
- Used most of the transformations such as the Source qualifier, Aggregator, Router, Filter, Sequence Generator, Stored Procedure, Expression and Lookup as per the business requirement.
- Worked on enhancements like developing new mapping or changes to the existing code.
- Implemented performance tuning and optimization techniques.
- Created clear, concise documentation of requirements for review between the technical and business groups.
- Coordinated production implementation between users and the technology area; analyzed, defined, and managed business needs.
- Experience working with Informatica Analyst and Informatica Developer tools.
- Worked on creating profiles, profile models, rules and mappings with IDQ.
- Created Profiles using IDQ rules and filters.
- Extraction, Transformation and Loading (ETL) of data by using Informatica Power Center.
- Applied cleansing rules using the Developer tool.
- Developed several complex mappings in Informatica using a variety of Power Center transformations, mapping parameters, mapping variables, mapplets, and parameter files in Mapping Designer, using both Informatica Power Center and IDQ.
- Analyzed queries behind long-running jobs and collected statistics for the tables involved.
- Implemented Join conditions between tables at the Database level.
- Created ETL (Extract Transform and Load) specification documents based on the business Requirements.
- Handled monthly release activities on time to keep up with the workload.
- Worked extensively in the Developer client in the Hadoop environment to maintain files and tables in the Hive environment.
- Used Tidal as the scheduler for all jobs and worked with the relevant UNIX scripts.
- Developed Source to Target data mapping documents, inclusive of all the transformation and business rules, logical and physical column names, data types, data flow diagrams used for ETL Design and development.
- Created Source to Target (ETL) mapping documents which included the fields, data type, definitions from both source and target systems. A business/transformation rule was also documented based on the business requirement or formatting needs of data.
- Worked on Netezza database for loading data into data warehouse.
- Provided strategic support in development of detailed project plans, work assignments, target dates etc.
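The set-max-variable CDC approach mentioned above keeps the highest change timestamp loaded so far and filters the next extract with it. A minimal sketch follows; the state-file path, column name, and date format are hypothetical stand-ins for what the real mapping variable and source filter would use.

```shell
#!/bin/sh
# Sketch of the "set max variable" CDC approach: remember the highest
# UPDATE_TS already loaded and filter the next extract with it. The state
# file, column name, and date format are hypothetical.
set -eu
STATE=${STATE:-/tmp/cdc_last_max_ts}
[ -f "$STATE" ] || echo "1900-01-01 00:00:00" > "$STATE"

LAST_MAX=$(cat "$STATE")
# This filter would become the Source Qualifier override for the next run.
echo "Source filter: UPDATE_TS > TO_TIMESTAMP('$LAST_MAX', 'YYYY-MM-DD HH24:MI:SS')"

# After a successful load, store the new max (in the real process this comes
# back from the load job; here we just use the current time).
NEW_MAX=${NEW_MAX:-$(date '+%Y-%m-%d %H:%M:%S')}
echo "$NEW_MAX" > "$STATE"
```

Inside PowerCenter the same idea is usually carried by a mapping variable updated with SETMAXVARIABLE, so the repository rather than a file holds the watermark; the file-based version above just makes the mechanics visible.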
Environment: Informatica Analyst tool/IDE 9.6.1, Informatica Data Quality 9.6.1, Teradata, JDE, Informatica Power Center 9.6.1, Oracle, Tidal Scheduler.
Confidential
Sr. ETL/IDQ Developer - Informatica
Responsibilities:
- Participated in team meetings and proposed ETL strategy based on Agile Methodology.
- Worked on Informatica PowerCenter 8.6.1-9.1.0 HF1 tools - Repository Manager, Informatica Designer, Workflow Manager/Monitor - and the Informatica Data Quality (IDQ) Developer and Analyst toolkits.
- Based on subject areas, provided concrete solutions for complex/critical mappings and created various complex mappings in different layers. Successfully implemented SCD Type 1/Type 2 to capture inserts, data changes, and delete operations and maintain data history. Created mapping and session variables/parameters, parameter files, mapplets, and reusable transformations for reuse during life cycle development.
- Created batches based on Subject Areas for different layers to run Workflows/Worklets and Sessions, scheduled the Workflows for load frequencies and configured them to load data.
- Involved in debugging invalid mappings. Tested mappings, sessions, and workflows to identify bottlenecks and tuned them for better performance. Built unit test queries to verify data accuracy.
- Migrated code from Development to Test and from Test to Production. Created effective unit, system, and integration tests of data in different layers to capture data discrepancies/inaccuracies and ensure accurate data loading. Created technical documentation for each mapping for future development.
- Designed and coded change requests per new requirements. Created pre- and post-session UNIX scripts and stored procedures to drop and re-create indexes and to handle complex calculations.
- Worked with Informatica Data Quality (IDQ) 9.5.1 Developer/Analyst Tools to remove the noise of data using different transformations like Standardization, Merge and Match, Case Conversion, Consolidation, Parser, Labeler, Address Validation, Key Generator, Lookup, Decision etc.
- Processed large sets of structured, semi-structured and unstructured data and supporting systems application using Big Data - Hadoop
- Assess business rules, collaborate with stakeholders and perform source-to-target data mapping, design and review with Big Data- Hadoop.
- Created Reference/Master data for profiling using IDQ Analyst tools. Used the Address Doctor Geo-coding table to validate the address and performed exception handling, reporting and monitoring the data.
- Built the Physical Data Objects and developed various mapping, mapplets/rules using the Informatica Data Quality (IDQ) based on requirements to profile, validate and cleanse the data. Identified and eliminated duplicate datasets and performed Columns, Primary Key, Foreign Key profiling using IDQ 9.5.1.
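The pre- and post-session index handling described in the earlier bullet can be sketched as follows. The index and table names are hypothetical, and the SQL files are only generated here for the session's pre/post-SQL (or a DBA task) to run.

```shell
#!/bin/sh
# Sketch: emit pre- and post-session SQL that drops and re-creates indexes
# around a bulk load. Index and table names are hypothetical; the files are
# generated only, to be executed by the session's pre/post-SQL step.
OUT=${OUT:-/tmp}

cat > "$OUT/pre_session.sql" <<'EOF'
-- Pre-session: drop indexes so the bulk load avoids index maintenance
DROP INDEX idx_sales_fact_dt;
DROP INDEX idx_sales_fact_cust;
EOF

cat > "$OUT/post_session.sql" <<'EOF'
-- Post-session: re-create the indexes after the load completes
CREATE INDEX idx_sales_fact_dt   ON sales_fact (sale_dt);
CREATE INDEX idx_sales_fact_cust ON sales_fact (cust_id);
EOF

echo "Wrote $OUT/pre_session.sql and $OUT/post_session.sql"
```

Dropping indexes before a large insert and rebuilding them once afterward is usually much cheaper than maintaining them row by row during the load.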
Environment: Informatica Power Center 9.1.0/8.6.1, Teradata, Oracle 11g, PL/SQL, UNIX, TOAD 9.5, Dynamic Shell Scripting, Web Services (WSDL), SQL Navigator, IDQ 9.5.1, IIR, BTEQ, Big Data - Hadoop, Hive.
Confidential
Informatica Developer
Responsibilities:
- Translated business requirements into data warehouse design.
- Designed and maintained logical and physical enterprise data warehouse schemas using Erwin
- Extracted Erwin physical models into the repository manager using Informatica PowerPlug.
- Used Star Schema approach for designing of Data Warehouse.
- Integrated different systems using Informatica server configuration. Extracted data from Flat files, Oracle, SQL Server, MS-Access and Loaded data into Oracle using Informatica.
- Created Source definitions, Target definitions, Mappings, Transformations, Reusable Transformations, Mapplets using Informatica Designer tool which includes Source Analyzer, Warehouse Designer, Transformation Developer, Mapping Designer and Mapplet Designer.
- Worked extensively on different types of transformations like source qualifier, expression, filter, aggregator, rank, update strategy, lookup, stored procedure, and sequence generator, joiner Transformation.
- Created sessions, database connections and batches, Scheduled and monitored transformation processes using Informatica Server Manager.
- Used Informatica repository manager to backup and migrate metadata in development, test and production systems.
Environment: Informatica Power Center 6.2/7.1, Cognos Impromptu 6.0, Impromptu web reports (IWR 6.0), UNIX, IIS 4.0, PL/SQL, Oracle 8.0/8i and Win NT.
Confidential
Sr. ETL Developer - Informatica
Responsibilities:
- Worked on data Integration from different sources using Informatica PowerCenter/Power Exchange 8.6.1/9.1 Tools- Repository Manager, Informatica Designer, Workflow Manager/ Monitor and Upgraded the system from 8.6.1 to 9.1. Participated in team meetings and proposed ETL strategies.
- Developed various complex Mappings and successfully implemented SCD Type1/Type 2 to keep the data history changes. Designed and coded change request as per the new requirements.
- Migrated code between repositories. Used the Debugger to validate mappings and gather troubleshooting information about data and error conditions. Involved in fixing invalid mappings. Tested mappings, sessions, workflows, and worklets. Wrote test queries to verify that data loaded properly into dimension and fact tables.
- Capable of processing large sets of structured, semi-structured and unstructured data and supporting systems application using Big Data - Hadoop
- Created and reviewed logical and physical data models for fact and dimension tables according to business requirements for the EDW. Created DDL scripts to implement data model changes. Created ERwin reports in HTML and RTF formats as required, published the data model in Model Mart, and coordinated with DBAs to apply data model changes.
- Created effective Unit, System and Integration Test cases for various stages of ETL to capture the data discrepancies and inaccuracies to ensure the successful execution of accurate data loading.
- Wrote Functions, Stored Procedures to drop & re-create the indexes and to solve the complex calculation as needed. Tested and maintained data integrity among various Sources and Targets.
- Worked on Performance Tuning to tune the data loading by identifying the Bottlenecks in Sources, Targets, Mappings, Transformations, Sessions, Database, Network then fixing them.
- Involved in providing Informatica Technical Support to the team members, as well as the business.
Environment: Informatica PowerCenter/Power Exchange 8.6.1/9.1, Teradata, Oracle 10g, MS SQL Server 2008, PL/SQL, ERwin 8.2, TOAD 9.5, PuTTY, WinSCP, UNIX, Web Services (WSDL), BTEQ, MySQL.