ETL Developer Resume Profile
SUMMARY:
- 7 years of IT experience in analysis, design, development, implementation, and troubleshooting of Data Warehousing solutions using Informatica PowerCenter in the telecommunications, banking, financial, pharmaceutical, insurance, and retail sectors.
- Highly proficient in Data Warehousing ETL using Informatica PowerCenter 6.1/7.1/8.1/8.6/9.1.0, Oracle 11g/10g/9i/8.x/7.x, Business Objects 5.0, and Cognos Impromptu 5.0.
- Expertise in Data Warehousing concepts such as the Ralph Kimball and Bill Inmon methodologies, Star Schema, Snowflake Schema, fact and dimension tables, logical and physical data modeling, dimensional and multidimensional modeling, data profiling, and data cleansing.
- Worked on Repository Server Administration Console, Informatica Power Center Transformations such as Source Qualifier, Lookup, Filter, Expression, Router, Joiner, Update Strategy, Rank, Aggregator, Stored Procedure, Sorter, Normalizer, Union and XML Source Qualifier.
- Extensively used Repository Manager, Workflow Manager, Workflow Monitor and worked on Informatica Designer Components like Source analyzer, Warehouse Designer, Transformation Developer, Mapplet and Mapping Designers.
- Extensive experience using the Informatica tool and SSIS packages for implementation of ETL methodology in data extraction, transformation, and loading.
- Expertise in developing mappings, defining workflow tasks, monitoring sessions, exporting/importing mappings and workflows, and backup/recovery.
- Extensive experience in extraction, transformation, and loading of data directly from heterogeneous source systems such as flat files (fixed-width and delimited), XML files, COBOL files, VSAM, IBM DB2 UDB, Excel, Oracle, Sybase, MS SQL Server, Teradata, and Netezza.
- Experience with Teradata utilities such as FastLoad, FastExport, MultiLoad, TPump, and TPT, and experience creating BTEQ scripts.
- Experience using scheduling tools such as TWS (Tivoli Workload Scheduler) and AutoSys to automate Informatica workflows as daily batch jobs.
- Sound knowledge of tuning Informatica mappings, identifying bottlenecks, and resolving issues to improve the performance of data loads and extracts.
- Well experienced in error handling and troubleshooting using various log files.
- Extensive knowledge of SQL, PL/SQL and Unix Scripting.
- Extensively worked with Oracle PL/SQL stored procedures, triggers, functions, and packages, and was involved in query optimization.
- Experienced in UNIX work environment, file transfers, job scheduling and error handling.
- Strong skills in data analysis, requirement analysis and data mapping for ETL processes.
- Experience in Creating and maintaining Database Objects like Tables, Views, Materialized views, Indexes, Constraints, Sequence, Table Partitions, Synonyms and Database Link.
- Good knowledge of interacting with Informatica Data Explorer (IDE) and Informatica Data Quality (IDQ).
- Extensively worked on the ETL mappings, analysis and documentation of OLAP reports requirements. Solid understanding of OLAP concepts and challenges, especially with large data sets.
- Experience with industry-standard methodologies such as Waterfall, Agile, and Scrum within the Software Development Life Cycle (SDLC).
- Ability to learn new concepts quickly.
- Excellent skills in understanding business needs and converting them into technical solutions.
- Strong problem-solving skills and a solid technical background; a results-oriented team player with excellent communication and interpersonal skills.
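As an illustration of the extract-transform-load pattern summarized above, a minimal Python sketch follows. The file layout, table, and column names are hypothetical, and SQLite stands in for the warehouse target; this is a sketch of the concept, not production ETL code.

```python
# Minimal ETL sketch: extract from a delimited flat file, apply basic
# cleansing, and load into a dimension table. All names are illustrative.
import csv, io, sqlite3

def extract(flat_file):
    """Extract rows from a pipe-delimited source (hypothetical layout)."""
    return list(csv.DictReader(flat_file, delimiter="|"))

def transform(rows):
    """Basic cleansing: trim whitespace and standardize case."""
    return [{k: v.strip().upper() for k, v in r.items()} for r in rows]

def load(rows, conn):
    """Load cleansed rows into the target dimension table."""
    conn.execute("CREATE TABLE IF NOT EXISTS dim_customer (cust_id TEXT, name TEXT)")
    conn.executemany("INSERT INTO dim_customer VALUES (:cust_id, :name)", rows)

source = io.StringIO("cust_id|name\n101| alice \n102|Bob\n")
conn = sqlite3.connect(":memory:")
load(transform(extract(source)), conn)
print(conn.execute("SELECT name FROM dim_customer ORDER BY cust_id").fetchall())
# -> [('ALICE',), ('BOB',)]
```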
PROFESSIONAL EXPERIENCE:
Confidential
Informatica Developer
Responsibilities:
- Interacted with Business Analyst to understand the requirements and created Business Requirement Specification Document.
- Coordinated with Informatica Administrator/Architect to setup the environment and to move objects to different repositories and folders.
- Extracted data from different heterogeneous sources like Oracle, XML, SQL Server, Flat files and transformed the data according to the business requirement and then loaded it into the target Teradata database.
- Analyzed data sources, including Oracle, flat files (delimited and fixed-width text files), and XML files carrying the contract and billing data, understood the relationships by analyzing the OLTP sources, and loaded the data into the Teradata warehouse.
- Used the FastLoad and MultiLoad utilities to bulk-load data into Teradata database tables, and used Teradata Parallel Transporter (TPT) for loading pipe-delimited, comma-delimited, and fixed-width data files into Teradata.
- Designed SSIS Packages to transfer data from flat files using Business Intelligence Development Studio.
- Used SSIS to develop jobs for extracting, cleaning, transforming and loading data into data warehouse.
- Defined the content, structure, and quality of highly complex data structures using Informatica Data Explorer (IDE).
- Designed mappings using transformations such as Lookup (connected, unconnected, and dynamic), Expression, Sorter, Joiner, Router, Union, Transaction Control, Update Strategy, Normalizer, Filter, Rank, and Aggregator in Informatica PowerCenter Designer.
- Configured index and data caches for cache-based transformations such as Rank, Lookup, Joiner, and Aggregator.
- Developed reusable transformations, mapplets to implement the common business logic according to the requirement.
- Worked on Slowly Changing Dimension (SCD) Type 1 and Type 2 mappings to maintain customers' full history.
- Implemented Change Data Capture (CDC) to extract changed data, and partitioned sessions for concurrent loading of data into the target tables.
- Worked on the Database Objects including Triggers, Stored Procedures, Functions and Database Constraints.
- Responsible for performance optimization: writing SQL overrides instead of using transformations, placing active transformations such as Filter as early as possible in the mapping, and selecting sorted input when using Aggregator or Joiner transformations.
- Created and configured workflows, worklets and sessions using Informatica Workflow Manager.
- Wrote pre-SQL and post-SQL at the session level.
- Converted workflows from relational connections to MultiLoad (mload) connections to Teradata, which drastically reduced overall workflow run time and improved performance.
- Used Informatica version control to check in all versions of the objects used in creating mappings and workflows, to keep track of changes across the development, test, and production environments.
- Worked on Data Cleansing and Standardization using the cleanse functions in Informatica MDM.
- Worked on session log files and workflow log files.
- Developed IDE reports, IDQ dashboard reports, and data validations.
- Extensively worked with Teradata Parallel Transporter (TPT) for data integration from various heterogeneous sources and to eliminate bottlenecks in data loads. Performed unit testing by generating SQL based on the test plans.
- Used AutoSys to schedule jobs, and extensively used UNIX scripting and pmcmd to interact with the Informatica server from the command line.
- Wrote shell scripts and added them to cron jobs scheduled daily, weekly, and monthly.
- Provided production support by troubleshooting session logs, bad files, and error log tables, and by debugging mappings.
- Maintained naming and warehouse standards for future application development, and created functional and technical specification documents.
- Worked with Oracle tools such as TOAD.
- Worked in an Agile methodology.
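The SCD Type 2 logic mentioned above can be sketched in Python. This is an illustrative model of the technique (expire the current dimension row and insert a new version when a tracked attribute changes), not Informatica's Update Strategy itself; all table and column names are hypothetical.

```python
# SCD Type 2 sketch: on a changed attribute, close out the current row
# and append a new current version with a fresh effective date.
from datetime import date

def apply_scd2(dim_rows, incoming, today):
    """dim_rows: dicts with cust_id, city, eff_date, end_date, current."""
    for row in dim_rows:
        if row["cust_id"] == incoming["cust_id"] and row["current"]:
            if row["city"] == incoming["city"]:
                return dim_rows          # no change: nothing to do
            row["end_date"] = today      # expire the old version
            row["current"] = False
    dim_rows.append({"cust_id": incoming["cust_id"], "city": incoming["city"],
                     "eff_date": today, "end_date": None, "current": True})
    return dim_rows

dim = [{"cust_id": 1, "city": "Austin", "eff_date": date(2010, 1, 1),
        "end_date": None, "current": True}]
apply_scd2(dim, {"cust_id": 1, "city": "Dallas"}, date(2012, 6, 1))
print([(r["city"], r["current"]) for r in dim])
# -> [('Austin', False), ('Dallas', True)]
```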
Environment: Informatica Power Center 9.5, Teradata, SQL Server, Oracle 10g/11g, TOAD 9.7, UNIX Shell Scripting, Windows XP, Autosys, Tivoli Workload Scheduler 8.4.
Confidential
Informatica Developer
Responsibilities:
- Gathered requirements from users and translated them into technical specifications.
- Designed and developed various Informatica mappings using transformations like Expression, Aggregator, External Procedure, Stored Procedure, Normalizer, Lookup, Filter, Joiner, Rank, Router, Update Strategy and XML.
- Developed various Ad-hoc mappings for various business needs.
- Performed design and analysis of business systems applications, system interfaces, databases, reporting, and business intelligence systems.
- Wrote Teradata SQL queries according to process needs.
- Used Teradata SQL Assistant for database work.
- Generated reports using Teradata BTEQ.
- Delivered new systems functionality supporting corporate business objectives.
- Translated requirements and high-level designs into detailed functional design specifications.
- Responsible for tuning ETL procedures and schemas to optimize load and query performance.
- Interpreted logical and physical data models for Business users to determine common data definitions.
- Supported the BI team by extracting operational data from multiple sources, merging and transforming the data to facilitate enterprise-wide reporting and analysis and delivering the transformed data to coordinated data marts.
- Involved in Data Validating, Data integrity, Performance related to DB, Field Size Validations, Check Constraints and Data Manipulation.
- Coordinated with Business Users to understand business needs and implement the same into a functional Data warehouse design.
- Extensively used SQL tools such as TOAD, Rapid SQL, and Query Analyzer to run SQL queries and validate the data.
- Implemented the concept of slowly changing dimensions to maintain current and historical data in the dimension.
- Used Autosys scheduler to schedule and run Informatica workflows.
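The load-tuning work above typically relies on incremental (delta) extraction: only rows modified after a persisted watermark are pulled on each run. A hedged Python sketch, with SQLite standing in for the source database and hypothetical table and column names:

```python
# Incremental extraction sketch: pull only rows changed since the last
# successful load, and advance the watermark for the next run.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (order_id INTEGER, amount REAL, updated_ts TEXT)")
conn.executemany("INSERT INTO orders VALUES (?, ?, ?)", [
    (1, 10.0, "2013-01-01"), (2, 20.0, "2013-02-01"), (3, 30.0, "2013-03-01")])

def extract_delta(conn, watermark):
    """Return rows changed since `watermark`, plus the new watermark."""
    rows = conn.execute(
        "SELECT order_id, amount, updated_ts FROM orders "
        "WHERE updated_ts > ? ORDER BY updated_ts", (watermark,)).fetchall()
    new_watermark = rows[-1][2] if rows else watermark
    return rows, new_watermark

delta, wm = extract_delta(conn, "2013-01-15")
print(len(delta), wm)
# -> 2 2013-03-01
```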
Environment: Informatica 9.1, Oracle 9i/10g/11g, Teradata, PL/SQL, UNIX Shell Scripting, TOAD.
Confidential
Informatica Developer
Responsibilities:
- Extracted data from various heterogeneous sources like Oracle, SQL Server, Teradata, MS Access and Flat files.
- Worked with power center tools like Designer, Workflow Manager, Workflow Monitor, and Repository Manager.
- Worked on Designer tools like Source Analyzer, Warehouse Designer, Transformation Developer, Mapplet Designer and Mapping Designer.
- Used Informatica data services to profile and document the structure and quality of all data.
- Worked with IDQ to certify the quality of data, ensuring that the data is correct, complete, conforms to standards, and is consistent throughout the organization.
- Extensively used Informatica transformations such as Source Qualifier, Rank, SQL, Router, Filter, Lookup, Joiner, Aggregator, Normalizer, and Sorter, along with all transformation properties.
- Solid Expertise in using both connected and unconnected Lookup Transformations.
- Extensively worked with various Lookup caches like Static cache, Dynamic cache and Persistent cache.
- Extensively worked with Joiner functions like normal join, full outer join, master outer join and detail outer join in the Joiner transformation.
- Created reports using Cognos from the data sources and distributed them to business users across the organization for better decision-making and improved corporate performance.
- Used loader utilities including SQL*Loader and the Teradata utilities BTEQ, FastLoad, FastExport, and MultiLoad.
- Worked with session logs and workflow logs for error handling and troubleshooting in the Dev environment.
- Used Debugger wizard to troubleshoot data and error conditions.
- Responsible for Best Practices like naming conventions, and Performance Tuning.
- Developed Reusable Transformations and Reusable Mapplets.
- Worked extensively with session parameters, mapping parameters, mapping variables, and parameter files for incremental loading.
- Worked with workflow System variables like SYSDATE and WORKFLOWSTARTTIME.
- Extensively used Various Data Cleansing and Data Conversion Functions in various transformations.
- Worked with shortcuts across shared and non-shared folders.
- Responsible for migrating code using deployment groups across various instances.
- Optimized SQL queries for better performance.
- Created pre-SQL and post-SQL scripts to be run at the Informatica level.
- Responsible for Unit Testing of Mappings and Workflows.
- Developed Slowly Changing Dimension Mappings for Type 1 SCD and Type 2 SCD.
- Responsible for implementing Incremental Loading mappings using Mapping Variables and Parameter Files.
- Responsible for determining the bottlenecks and fixing the bottlenecks with performance tuning.
- Implemented various loads like Daily Loads, Weekly Loads, and Quarterly Loads using Incremental Loading Strategy.
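The incremental-loading setup above is driven by parameter files. A hedged Python sketch of reading an Informatica-style parameter file follows; the folder, workflow, session, and parameter names are illustrative, though real parameter files use the same [section] / $$NAME=value layout.

```python
# Parameter-file parsing sketch: sections name a workflow/session scope,
# and $$ entries hold mapping parameters such as the last load date.
def parse_param_file(text):
    params, section = {}, None
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue                      # skip blanks and comments
        if line.startswith("[") and line.endswith("]"):
            section = line[1:-1]          # new scope, e.g. folder.WF:...ST:...
            params[section] = {}
        elif "=" in line and section:
            name, value = line.split("=", 1)
            params[section][name.strip()] = value.strip()
    return params

sample = """
[MyFolder.WF:wf_daily_load.ST:s_m_load_orders]
$$LastLoadDate=2013-03-01
$$SourceSystem=ORDERS
"""
parsed = parse_param_file(sample)
print(parsed["MyFolder.WF:wf_daily_load.ST:s_m_load_orders"]["$$LastLoadDate"])
# -> 2013-03-01
```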
Environment: Informatica PowerCenter 8.6, Informatica PowerExchange, TOAD 8.6, Erwin 4.2/7.x, Cognos Impromptu, Oracle 11i, Teradata, SQL Server, SSIS 2005, Access, flat files, SQL/PL-SQL, UNIX shell scripting, Windows XP.
Confidential
ETL Developer
Responsibilities:
- Involved in gathering Business Information from Business Analyst.
- Involved in conversion of Business Specifications into Technical Specifications.
- Uploaded data from the source transactional system into the staging area.
- Extensively used various transformations like Filter, Router, Sequence Generator, Look Ups, Update Strategy, Joiner, Source Qualifier, Expression, Sorter and Aggregator.
- Extensively used Mapping Variables, Mapping Parameters and Parameter Files for capturing delta loads.
- Extensively worked on various reusable tasks, workflows, worklets, mapplets, and reusable transformations, which saved design time and effort.
- Worked on Business Intelligence tool Business Objects XI R2.
- Created Reports using Charts, cross-tab, Sub-reports, Running Totals.
- Created Templates for standardizing Corporate Document structure.
- Extensively worked on various Look up Caches like Static, Dynamic, Persistent, and Shared Caches.
- Worked on session logs, the Informatica Debugger, and performance logs for error handling when workflows and sessions failed.
- Worked extensively with the business intelligence team to incorporate any changes that they need in the delivered files.
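The static versus dynamic lookup caches mentioned above differ in one key behavior: a dynamic cache is updated as rows pass through, so repeated keys within the same run are caught. A hedged Python sketch of that behavior (illustrative names, not Informatica APIs):

```python
# Dynamic lookup cache sketch: new keys are inserted into the cache as
# rows flow through, so a later duplicate is routed to the update path.
def route_rows(rows):
    cache, inserts, updates = {}, [], []
    for key, value in rows:
        if key in cache:
            updates.append((key, value))   # key seen this run -> update
        else:
            cache[key] = value             # new key -> dynamic cache insert
            inserts.append((key, value))
    return inserts, updates

ins, upd = route_rows([(1, "a"), (2, "b"), (1, "c")])
print(ins, upd)
# -> [(1, 'a'), (2, 'b')] [(1, 'c')]
```

A static cache, by contrast, is built once before the run and would treat all three rows identically.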
Environment: Informatica Power Center 8.1, Teradata, SQL Server, Oracle 9i, Business Objects XI R2, Toad.
Confidential
ETL/SQL Developer
Responsibilities:
- Involved in the Dimensional Data Modeling and populating the business rules using mappings into the Repository for Data management.
- Designed Sources to Targets mapping from primarily Flat files to Oracle using Informatica Power Center.
- Created global repository, Groups, Users assigned privileges using repository manager.
- Involved in developing source to target mappings and scheduling Informatica sessions
- Various transformations were used to implement simple and complex business logic, including connected/unconnected Lookups, Router, Expression, Source Qualifier, Aggregator, Filter, and Sequence Generator.
- Extensively worked in Oracle SQL and PL/SQL and query performance tuning; created DDL scripts and database objects such as tables, indexes, synonyms, and sequences.
- Tuned Informatica mappings and sessions for optimum performance.
- Worked in writing UNIX scripts, SQL statements and interacted with development and production teams to expedite the process.
- Developed UNIX shell scripts to run the pmcmd functionality to start and stop sessions.
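A wrapper script driving pmcmd, as in the bullet above, can be sketched in Python. The command line is only constructed here, not executed, and the service, domain, user, folder, and workflow names are placeholders; the flags shown (-sv, -d, -u, -pv, -f, -wait) follow pmcmd's startworkflow usage.

```python
# pmcmd wrapper sketch: build the command line a scheduler script would
# invoke to start a workflow and wait for it to complete.
def pmcmd_start(service, domain, user, pwd_var, folder, workflow):
    """Return the argv list for `pmcmd startworkflow` (not executed here)."""
    return ["pmcmd", "startworkflow",
            "-sv", service, "-d", domain,
            "-u", user, "-pv", pwd_var,   # -pv reads the password from an env var
            "-f", folder, "-wait", workflow]

cmd = pmcmd_start("IS_DEV", "Domain_Dev", "etl_user", "PMPASS",
                  "MyFolder", "wf_daily_load")
print(" ".join(cmd))
```

In a real script the list would be passed to subprocess.run, with the exit code checked to decide success or restart.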
Environment: Informatica Power Center 7.1.3, Oracle 8i, SQL Server 2000, SQL, PL/SQL, VSAM files.