ETL Informatica Developer with Teradata Architecture Resume
Burbank, CA
SUMMARY:
- Over eight (8+) years of IT experience focused on Informatica PowerCenter, Teradata, Oracle, DB2, Cognos, and UNIX shell scripting.
- Designed and developed complex mappings and mapplets using a wide range of transformations, including reusable transformations, Unconnected/Connected Lookup, Router, Filter, Expression, Aggregator, Joiner, Update Strategy, Stored Procedure, Normalizer, and SQL transformations.
- As an ETL Developer, designed and implemented Data Mart/Data Warehouse applications using Informatica PowerCenter 10.1.0/9.5.1/9.x/8.x/7.x Designer, Workflow Manager, Workflow Monitor, Repository Manager, and Admin Console.
- Hands-on experience with Informatica MDM Hub configurations: data mappings (Landing, Staging, and Base Objects), data validation, match and merge rules, batch jobs, and customizing/configuring Informatica Data Director (IDD) applications, among others.
- Created rule-based linking (RBL) files for custom linking in Informatica Metadata Manager to generate data lineage.
- Created custom models in Informatica Metadata Manager for data governance and impact analysis.
- Experience in identifying bottlenecks in ETL processes and performance tuning of production applications using database tuning, partitioning, index usage, aggregate tables, session partitioning, load strategies, commit intervals, and transformation tuning.
- Good knowledge on SDLC (software development life cycle) and good experience with unit testing and integration testing.
- Well acquainted with Informatica Designer Components - Source Analyzer, Warehouse Designer, Transformation Developer, Mapplet and Mapping Designer.
- Worked extensively with complex mappings using different transformations like Source Qualifiers, Expressions, Filters, Joiners, Routers, Union, Unconnected/Connected Lookups and Aggregators.
- Strong Experience in developing Sessions/tasks, Worklets, Workflows using Workflow Manager Tools - Task Developer, Workflow & Worklet Designer.
- Strong experience in Creating Database Objects such as Tables, Views, Functions, Indexes, Triggers, Cursors in Teradata.
- Extensively worked with Teradata utilities like BTEQ, FastExport, FastLoad, MultiLoad, TPT, and TPump to export and load data to/from different source systems, including flat files.
- Extensive experience with the Teradata database, analyzing clients' business needs, developing effective and efficient solutions, and ensuring client deliverables within committed timelines.
- Strong understanding of Data modeling, Data analysis, implementations of Data warehousing using Windows and UNIX.
- Good knowledge of data warehouse concepts and principles (Kimball/Inmon): star schema, snowflake schema, surrogate keys, and normalization/de-normalization.
- Proficient in developing solutions for Slowly Changing Dimensions (Type 1 & Type 2), incremental loading, incremental aggregation, constraint-based loading, and Change Data Capture (CDC).
- Expertise in maintaining data quality, data organization, metadata and data profiling.
- Experience in all aspects of analytics/data warehousing solutions: database issues, data modeling, data mapping, ETL development, metadata management, data migration, and reporting solutions.
- Experience in Business analysis and Data analysis, User requirement gathering, User requirement analysis, Data cleansing, Data transformations, Data relationships, Source systems analysis and Reporting analysis.
- Strong Knowledge of database architecture for OLTP and OLAP applications, Data Analysis and ETL processes.
- Experience in using automation/scheduling tools like Autosys and Control-M.
- Created mapping documents, workflows, and data dictionaries.
- Very good understanding of Reporting tools like Cognos and OBIEE.
- Experience in writing UNIX shell scripts to support and automate the ETL process.
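As a concrete illustration of the shell-based ETL automation mentioned above, the sketch below validates an incoming flat file against its trailer record before a workflow launch. The trailer layout (`TRL|<row_count>`), file names, and the commented `pmcmd` call are hypothetical examples, not any specific production setup.

```shell
#!/bin/sh
# Pre-load feed validation: compare the data row count against the
# trailer record before triggering the Informatica workflow.
# Assumes a pipe-delimited feed whose last line is "TRL|<expected_count>".

validate_feed() {
    feed="$1"
    expected=$(tail -1 "$feed" | cut -d'|' -f2)   # count stored in the trailer
    actual=$(( $(wc -l < "$feed") - 1 ))          # data rows, trailer excluded
    if [ "$actual" -eq "$expected" ]; then
        echo "OK: $actual rows"
    else
        echo "MISMATCH: trailer=$expected actual=$actual" >&2
        return 1
    fi
}

# On success, the workflow would be launched, e.g. (names illustrative):
# pmcmd startworkflow -sv INT_SVC -d DOMAIN -f FOLDER wf_load_feed
```

A scheduler such as Autosys or Control-M would call this script and act on its exit code.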
TECHNICAL SKILLS:
OS: UNIX (AIX), Windows NT/2000/XP
Databases: Teradata, Oracle 11g/10g/9i/8i, DB2 v8.1, MS SQL Server 2008
ETL Tools: Informatica PowerCenter 7.x/8.6.1/9.1/10.1.0, Metadata Manager v10.x, CDC, Data Integrator
Scheduling Tools: Autosys, Control-M
Data modeling tools: Erwin 4.0, Sybase Power Designer 16.1
Methodologies: SDLC, Ralph Kimball Methodology
Languages: SQL, PL/SQL, TSQL, Java and UNIX Shell Scripting
PROFESSIONAL EXPERIENCE:
Confidential, Burbank, CA
ETL Informatica Developer
Responsibilities:
- Involved in identifying the data models for the various datasets.
- Created and analyzed data profiles for tables coming from multiple sources.
- Designed the ETL architecture and process.
- Involved in various design and requirement discussions with end users and the client.
- Created ETL Mapping specifications using functional specifications.
- Created Technical Specifications document based on functional specifications.
- Developed complex ETL mappings involving parallel processing of multiple instances based on parameters in a control table.
- Tuned Informatica ETL mappings after analyzing them thoroughly.
- Identified bottlenecks at different levels (database, ETL, etc.) and developed solutions to increase performance.
- Created MDM landing tables, loaded them using Informatica PowerCenter, and scheduled the loads using the Autosys scheduler.
- Created custom models in Informatica Metadata Manager for data governance and impact analysis.
- Loaded the business glossary into Informatica Metadata Manager and created rule-based linking between resources to generate data lineage.
- Worked on Informatica Metadata Manager to create data lineage using custom models and RBLs.
- Created a business glossary using Informatica Metadata Manager, linking physical objects to the business dictionary.
- Created a model for the audit mechanism and included audit counts in each ETL to verify source and target counts and sums.
- Identified the extracts needed for downstream applications and designed structures for them.
- Performed performance tuning on both the ETL and database sides to minimize data extraction and processing times.
- Standardized Disney/industry titles coming from various sources using Informatica IDQ.
- Created unit test plans for the ETLs developed.
- Performed unit testing and integration testing for the ETLs.
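The audit mechanism described in the bullets above can be sketched as a count-and-checksum comparison. In practice the figures would come from Teradata via BTEQ; here, purely for illustration, they are computed from pipe-delimited extract files with an amount in the second column (the file layout is an assumption).

```shell
#!/bin/sh
# Audit reconciliation sketch: compare the row count and an amount
# checksum between a source extract and a target extract.
# Assumed layout: pipe-delimited, amount in column 2.

audit_compare() {
    src="$1"; tgt="$2"
    src_stats=$(awk -F'|' '{n++; s+=$2} END {printf "%d %.2f", n, s}' "$src")
    tgt_stats=$(awk -F'|' '{n++; s+=$2} END {printf "%d %.2f", n, s}' "$tgt")
    if [ "$src_stats" = "$tgt_stats" ]; then
        echo "AUDIT PASS: $src_stats"
    else
        echo "AUDIT FAIL: source=$src_stats target=$tgt_stats" >&2
        return 1
    fi
}
```

The same pattern extends to per-column sums or hash totals when a simple amount checksum is not enough.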
Confidential, CA
ETL Informatica Developer
Responsibilities:
- Involved in the requirement definition and analysis in support of Data Warehouse efforts.
- Involved in design and development of data warehouse environment, liaison to business users and/or technical teams gathering requirement specification documents and presenting and identifying data sources, targets and report generation.
- Worked closely with the end users in writing the functional specifications based on the business needs.
- Provided data modeling support for numerous strategic application development projects.
- Studied the source systems through reverse engineering.
- Fully documented proposed solutions including data dictionary and related metadata elements.
- Developed standards and procedures for transformation of data as it moves from source systems to the data warehouse.
- Created Technical design specification documents based on the functional design documents and the physical data model.
- Developed mappings to extract data from Oracle, Flat files, Delimited Files and load into Data warehouse using the Mapping Designer.
- Created data lineage using Metadata Manager for impact analysis and data governance.
- Created custom resources to load external metadata into the Informatica Metadata Manager repository for end-to-end application lineage.
- Developed Reusable Mapplets and Transformations for reusable business calculations.
- Used Source Analyzer and Warehouse designer to import the source and target database schemas, and the Mapping Designer to map the sources to the target.
- Created complex mappings in Power Center Designer using Expression, Filter, Sequence Generator, Update Strategy, Joiner and Stored procedure transformations.
- Implemented methods to validate that data supplied by external sources was loaded correctly into the awards database.
- Used Informatica Designer to create complex mappings using different transformations like Filter, Router, Connected & Unconnected lookups, Joiner, Update Strategy, Expressions and Aggregator transformations to pipeline data to Data Warehouse.
- Tested all business application rules with test and live data; automated and monitored the sessions using Workflow Manager and Workflow Monitor.
- Created, launched & scheduled Workflows/sessions. Involved in the Performance Tuning of Mappings and Sessions.
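Launching and scheduling workflows as described above is often wrapped in a small shell helper so the scheduler sees a clean exit code. A minimal sketch under that assumption: the launcher command is passed in as arguments, and the `pmcmd` invocation in the comment is illustrative only.

```shell
#!/bin/sh
# Generic workflow launcher wrapper: run the given command and surface
# success or failure in a scheduler-friendly way. In production "$@"
# would be something like:
#   pmcmd startworkflow -sv INT_SVC -d DOMAIN -f FOLDER -wait wf_load_dw

run_workflow() {
    if "$@"; then
        echo "workflow succeeded"
    else
        rc=$?
        echo "workflow failed rc=$rc" >&2
        return "$rc"
    fi
}
```

Passing the launcher as arguments keeps the wrapper testable without an Informatica domain available.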
Confidential, NY
ETL Informatica Developer
Responsibilities:
- Involved in Full life cycle design and development of Data warehouse.
- Involved in fact-dimension model design using the star schema model.
- Involved in ETL Architecture and Design
- Developed logical data models and physical data models with experience in Forward and Reverse Engineering using Erwin.
- Involved in design for development of 12 Primary Dimensions and 8 Secondary Dimensions as per the Data Warehouse Requirements
- Designed and created reusable data flows that can be used in other workflows with common functionality.
- Involved in design and creation of common Oracle Stored Procedures that load data from various transactions into Dimensions and Fact.
- Implemented source-target count reconciliation and source-target checksum reconciliation.
- Implemented numerous performance-tuning steps at the data-flow level to improve execution performance.
- Developed SCD 1, SCD 2 and SCD 3 data flows for all the Primary Dimensions.
- Created an Issue Log to identify the errors and used it for preventing any such errors in future development works.
- Prepared Test Cases for dimensions and transactions as per the user requirements to validate the data loaded in Data Warehouse.
- Loaded small set of data in QA environment to validate the data with the Test Cases prepared.
- Migrated the code from QA to production, performed the initial historic load, and scheduled the ongoing loads using the Data Integrator scheduler.
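The SCD Type 2 data flows mentioned above follow an expire-then-insert pattern. The sketch below only prints the generated SQL (it does not connect to Teradata), and the table and column names (`dim_customer`, `stg_customer`, `end_dt`, `current_flag`) are hypothetical.

```shell
#!/bin/sh
# Type 2 SCD sketch: emit the expire/insert SQL for a dimension load.
# Prints SQL only; dimension and staging table names are arguments.

gen_scd2_sql() {
    dim="$1"; stg="$2"
    cat <<SQL
-- expire the current version of any key present in staging
UPDATE $dim
SET end_dt = CURRENT_DATE, current_flag = 'N'
WHERE customer_id IN (SELECT customer_id FROM $stg)
  AND current_flag = 'Y';

-- insert the new version with an open-ended effective range
INSERT INTO $dim (customer_id, cust_name, start_dt, end_dt, current_flag)
SELECT customer_id, cust_name, CURRENT_DATE, DATE '9999-12-31', 'Y'
FROM $stg;
SQL
}

gen_scd2_sql dim_customer stg_customer
```

In a real load, the generated statements would be fed to BTEQ inside a transaction.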
Confidential, Monroe, NC
ETL Informatica Developer
Responsibilities:
- Analyze business requirements to build a data mart design for various business processes conformed to the business rules.
- Involved in developing OLAP models like facts, measures and dimensions (Data Models) and drafted the ETL specifications.
- Maintained warehouse metadata, naming standards and warehouse standards for future application development.
- Involved in business analysis and technical design sessions with business and technical staff to develop Entity Relationship/data models, requirements document, and ETL specifications.
- Used Informatica Designer to create complex mappings, transformations, source and target tables.
- Administered the repository by creating folders and logins for group members and assigning the necessary privileges.
- Used Workflow Manager to create, validate, test, run, and schedule sequential and concurrent sessions that read data from different sources and write it to target databases.
- Used exception-handling logic in all mappings to handle null values and rejected rows.
- Optimized queries to improve the performance of the data warehouse.
- Developed standard and re-usable mappings and mapplets using various transformations like expression, aggregator, joiner, source qualifier, router, lookup Connected/Unconnected, and filter.
- Involved in performance tuning of the mappings, sessions and workflows.
- Developed Type 1 and Type 2 SCD mappings.
- Used session parameters and parameter files to reuse sessions for different relational sources or targets.
- Developed PL/SQL and UNIX Shell Scripts using VI editor.
- Involved in Data Extraction from Oracle and Flat Files using SQL Loader.
- Performed data quality checks and data cleansing using Data Explorer.
- Performed PL/SQL programming (stored procedures, triggers, packages) using Oracle (SQL, PL/SQL) and Sybase.
- Created documentation on mapping designs and ETL processes.
- Maintained Development, Test, and Production mapping migrations using Repository Manager.
- Developed Korn shell scripts for Informatica pre-session and post-session procedures.
- Involved in Unit testing, User Acceptance Testing to check whether the data is loading into target, which was extracted from different source systems per the user requirements.
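A post-session step like the Korn shell procedures above often just archives the processed source file. A minimal sketch under assumed conventions: the dated archive layout and file names are illustrative, not a specific production standard.

```shell
#!/bin/sh
# Post-session housekeeping sketch: move a processed source file into a
# dated archive directory and compress it.

archive_file() {
    f="$1"; arch="$2"
    day=$(date +%Y%m%d)
    mkdir -p "$arch/$day"
    mv "$f" "$arch/$day/" && gzip -f "$arch/$day/$(basename "$f")"
    echo "archived $(basename "$f") to $arch/$day"
}
```

Informatica would invoke this as a post-session success command, keeping the landing directory clean for the next run.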
Confidential, VA
ETL Developer
Responsibilities:
- Worked on a team designing the database schema for the metadata that stores dynamically generated queries.
- Identifying integrity constraints for tables.
- Involved in writing triggers that internally call procedures and functions.
- Tested the application to ensure proper functionality and data accuracy, and that modifications had no adverse impact on the integrated system environment.
- Experience with Data flow diagrams, Data dictionary, Database normalization theory techniques, Entity relation modeling and design techniques.
- Expertise in client-server application development using Oracle PL/SQL, SQL*Plus, TOAD, and SQL*Loader.
- Effectively made use of table functions, indexes, table partitioning, collections, analytical functions, materialized views, query rewrite, and transportable tablespaces.
- Strong experience in Data warehouse concepts, ETL.
- Good knowledge on logical and physical Data Modeling using normalizing Techniques.
- Created Tables, Views, Constraints, Index (B Tree, Bitmap and Function Based).
- Developed Complex database objects like Stored Procedures, Functions, Packages and Triggers using SQL and PL/SQL.
- Developed materialized views for data replication in distributed environments.
- Involved in documenting the entire process.
- Involved in Implementation and Testing phases of the system
- Worked with different sources such as DB2, SQL Server, Excel, and flat files.
- Used Informatica to extract data into Data Warehouse.
- Extensively worked with various lookup caches like Static Cache, Dynamic Cache and Persistent Cache.
- Used Update Strategy expressions with DD_INSERT and DD_UPDATE to insert and update data when implementing Slowly Changing Dimension logic.
- Developed Slowly Changing Dimensions Mapping for Type 1 SCD and Type 2 SCD.
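The Type 1 SCD logic above (DD_UPDATE for known keys, DD_INSERT for new ones) maps onto an update-then-insert SQL pair. The sketch prints generic SQL only and does not touch a database; the table and column names are hypothetical.

```shell
#!/bin/sh
# Type 1 SCD sketch: overwrite changed attributes for existing keys
# (the DD_UPDATE path), then insert brand-new keys (the DD_INSERT path).
# Prints SQL only; table/column names are illustrative.

gen_scd1_sql() {
    dim="$1"; stg="$2"
    cat <<SQL
-- DD_UPDATE path: overwrite the attribute in place for existing keys
UPDATE $dim
SET cust_name = (SELECT s.cust_name FROM $stg s
                 WHERE s.customer_id = $dim.customer_id)
WHERE customer_id IN (SELECT customer_id FROM $stg);

-- DD_INSERT path: add keys not yet present in the dimension
INSERT INTO $dim (customer_id, cust_name)
SELECT s.customer_id, s.cust_name
FROM $stg s
WHERE s.customer_id NOT IN (SELECT customer_id FROM $dim);
SQL
}

gen_scd1_sql dim_customer stg_customer
```

Unlike the Type 2 flow, no history row is kept: the old attribute value is simply overwritten.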