Sr. ETL Informatica / Teradata Developer Resume
Smithfield, RI
SUMMARY:
- Around 8 years of experience in Information Technology with emphasis on Data Warehouse/Data Mart development, designing strategies for extraction, transformation, and loading (ETL) in Informatica Power Center 10.1/9.6.1/9.5.1/9.1.1/8.6.1/8.1.1/7.1.2 from various database sources.
- Strong work experience across the ETL life cycle; performed ETL procedures to load data from different sources into data marts and the data warehouse using Power Center Designer, Workflow Manager and Workflow Monitor.
- Involved in Informatica upgrade projects from one version to another.
- Involved in understanding business processes and identifying dimensions and facts for OLAP applications.
- Strong knowledge of Data Modeling.
- Comprehensive experience working with Type 1 and Type 2 methodologies for Slowly Changing Dimension (SCD) management.
- Proficient in the integration of various data sources with multiple relational databases like Oracle, MS SQL Server, DB2, MySQL and Flat Files into the staging area, ODS, Data Warehouse and Data Mart.
- Extensive experience in developing Stored Procedures, Functions, Triggers and Complex SQL queries.
- Experience using the Debugger to validate mappings and gain troubleshooting information about data and error conditions.
- Good working knowledge of various Informatica designer transformations like Source Qualifier, Dynamic and Static Lookups, connected and Unconnected lookups, Expression, Data Masking, Filter, Router, Joiner, Normalizer and Update Strategy transformation.
- Extensively used Teradata utilities like TPump, FastLoad, MultiLoad, BTEQ and FastExport.
- Involved in SQL performance tuning.
- Worked on performance tuning, identifying and resolving performance bottlenecks in Informatica.
- Experience in integrating Hadoop with Informatica Power Center, including migrating Hadoop processes to Informatica processes; performed end-to-end development for the data the client needed and maintained master data using Informatica MDM.
- Experience in task automation using UNIX scripts, job scheduling and communicating with the server using pmcmd.
- Extensively used Autosys, Control-M and Maestro for job monitoring and scheduling, along with production on-call support.
- HP Vertica Event Log Processing on AWS project: designed and implemented all components of the MySQL event web-log processing data mart.
- Highly motivated and goal-oriented individual with a strong background in SDLC project management and resource planning using Agile and Waterfall methodologies.
- Excellent interpersonal and communication skills.
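The Type 2 SCD handling mentioned above can be sketched as follows. This is an illustrative Python sketch of the expire-and-append pattern (close the current row, insert a new version), not code from any project listed here; all field names are hypothetical.

```python
from datetime import date

def apply_scd_type2(dimension, incoming, today=None):
    """Type 2 SCD: when a tracked attribute changes, expire the current
    row and append a new current version. `dimension` is a list of dicts
    with keys: key, attrs, eff_from, eff_to, current (all illustrative)."""
    today = today or date.today()
    current = {r["key"]: r for r in dimension if r["current"]}
    for row in incoming:
        existing = current.get(row["key"])
        if existing is None:
            # brand-new business key: insert as the first current version
            dimension.append({"key": row["key"], "attrs": row["attrs"],
                              "eff_from": today, "eff_to": None, "current": True})
        elif existing["attrs"] != row["attrs"]:
            existing["eff_to"] = today      # expire the old version
            existing["current"] = False
            dimension.append({"key": row["key"], "attrs": row["attrs"],
                              "eff_from": today, "eff_to": None, "current": True})
    return dimension
```

In Informatica terms, the lookup against `current` plays the role of a Lookup transformation and the two append branches correspond to Update Strategy insert paths.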
TECHNICAL SKILLS:
ETL Tools: Informatica Power Center 10.1/9.6.1/9.5.1/8.6.1/8.1.1, Informatica Cloud
RDBMS: Oracle 10g/9i/8i, Teradata 14/12, DB2, SQL Server 2000/2005/2008, MySQL, Sybase
Modeling: Dimensional Data Modeling, Star Schema Modeling, Snowflake Schema Modeling, Erwin, Microsoft Visio.
QA Tools: Quality Center
Operating System: Windows, Unix, Linux
Reporting Tools: Cognos, Business Objects
Languages: Java, XML, UNIX Shell Scripting, SQL, PL/SQL, Perl
PROFESSIONAL EXPERIENCE:
Confidential, Smithfield, RI
Sr. ETL Informatica / Teradata Developer
Responsibilities:
- Involved in coding, testing, implementing, debugging and documenting the complex programs.
- Used the Data Masking transformation to change sensitive production data to realistic test data for non-production environments.
- Created Mappings with shared objects, Reusable Transformations and Mapplets.
- Worked with different databases: Teradata and DB2.
- Used UDFs (user-defined functions) to handle different data rules.
- Prepared optimized SQL queries to handle performance bottlenecks.
- Worked with different sources like Oracle, SQL Server and Flat Files.
- Extensively used SQL Overrides in Lookups, Source filter and Source Qualifiers.
- Created UNIX shell scripts to kick off Informatica workflow in batch mode.
- Invoked Informatica workflows using “pmcmd” utility from the UNIX script.
- Developed Stored Procedures, Functions, Views and Triggers, Complex SQL queries using SQL Server and Oracle PL/SQL.
- Worked with the Teradata data warehouse and utilities (BTEQ, TPT and FastLoad).
- Expertise in Oracle database development using SQL and PL/SQL, Stored Procedures, Triggers, Functions and Packages, and SQL performance tuning.
- Built high-quality, scalable and resilient distributed systems that support business analytics for time-sensitive and critical decisions.
- IDW development of ETL and semantic components using a control framework, metadata and flute. Performed complex defect fixes in environments such as UAT to ensure proper delivery of developed jobs into production.
- Designed and developed complex ETL mappings by making use of transformations like Source Qualifier, Joiner, Update Strategy, Connected Lookup and unconnected Lookup, Rank, Expression, Router, Filter, Aggregator and Sequence Generator transformations.
- Responsible for efficient monitoring of the process through regular ETL load status check.
- Proactively took responsibilities for internal organizational processes like Quality Management & Employee Performance management.
- Created Parameter files and validation scripts, Created Reusable and Non-Reusable command task in Workflow Manager.
- Performed migration of Informatica components to Informatica Cloud.
- Worked on the installation and setup of ETL (Informatica Cloud) applications on Linux servers.
- Created Sessions, command task, reusable worklets and workflows in Workflow Manager.
- During the course of the project, participated in multiple meetings with the client and data architect / ETL architect to propose better strategies for performance improvement and gather new requirements.
- Provided production support for business users and documented problems and solutions for running the workflows.
- Created and scheduled jobs, on demand and on time, using the Control-M scheduling tool. Created UNIX shell scripts to FTP flat files from different ordering systems to the ETL server.
Environment: Informatica Power Center 9.6.1, Teradata, Oracle 11g, UNIX, SQL Server, Flat Files, Quality Center, PuTTY, WinSCP3, Toad
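The data-masking work described above (turning sensitive production data into realistic test data) can be illustrated with a deterministic substitution sketch in Python. This is not the Informatica Data Masking transformation itself; the `salt` and `keep_chars` parameters are hypothetical, chosen to show why deterministic masking preserves joinability across tables.

```python
import hashlib

def mask_value(value, salt="demo-salt", keep_chars=0):
    """Deterministically mask a sensitive string: the same input always
    yields the same masked output, so keys still join across masked
    tables, but the original value is not recoverable without the salt."""
    digest = hashlib.sha256((salt + value).encode()).hexdigest()
    visible = value[:keep_chars]          # optionally keep a leading prefix
    return visible + digest[: max(len(value) - keep_chars, 4)]
```

Because the mapping is repeatable, a masked SSN used as a lookup key in one table matches the same masked SSN in another, which is the property that makes masked data usable in non-production environments.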
Confidential, Southlake,TX
Sr. ETL Informatica / Teradata Developer
Responsibilities:
- Involved in analysis of source systems, business requirements and identification of business rule and responsible for developing, support and maintenance for the ETL process using Informatica PC.
- Involved in the Informatica upgrade process from one version to another.
- Created / updated ETL design documents for all the Informatica components changed.
- Extracted data from heterogeneous sources like Oracle, DB2, XML and Flat Files, performed data validation and cleansing in the staging area, then loaded into the Teradata data warehouse using Teradata utilities and Informatica.
- Wrote Teradata BTEQ, MultiLoad and FastLoad scripts, and used TPT in Informatica.
- Loaded data from source to stage and from stage to core/base.
- Made use of reusable Informatica transformations, shared sources and targets.
- Created Informatica transformations/mapplets/mappings/tasks/worklets/workflows using Power Center, along with mapping parameters and variables.
- Worked on Data Masking transformation to process the confidential claims data for the testing purposes.
- Implemented various loads like daily, weekly, quarterly and on-demand loads using an incremental loading strategy and Change Data Capture (CDC) concepts.
- Identified performance issues in existing sources, targets and mappings by analyzing the data flow, evaluating transformations and tuned accordingly for better performance.
- Created mappings for Type 1 and Type 2 slowly changing dimensions (SCD) and complete-refresh mappings.
- Extensively used data cleansing and data conversion functions like LTRIM, RTRIM, TO_DATE, DECODE and IIF in the Expression transformation.
- Extensively used the Workflow Manager tasks like Session, Event-Wait, Timer, Command, Decision, Control and E-mail while creating worklets/workflows.
- Designed and developed ETL jobs using HP Vertica, DataStage, Teradata and Linux shell scripts.
- Worked in a Hadoop environment using HiveQL.
- Worked with the reporting team to provide EDW data for their Business Objects reports.
- Worked with “pmcmd” command line program to communicate with the Informatica server, to start, stop and schedule workflows.
- HP Vertica Event Log Processing on AWS project: designed and implemented all components of the MySQL event web-log processing data mart.
- Supported the EDW environment on a 24x7 rotation and made strong use of the Maestro scheduling tool.
- During the course of the project, participated in multiple meetings with the client and data architect / ETL architect to propose better strategies for performance improvement and gather new requirements.
Environment: Informatica Power Center 9.6.1/9.5.1, Oracle 11g, DB2, XML, Flat Files, Teradata 14/12, Hadoop, Hive QL, Maestro, UNIX, Windows, Toad.
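The incremental-load / CDC strategy mentioned in this role can be sketched as a watermark-based extract. This is an illustrative Python sketch, not project code; the `updated_at` field and the shape of `rows` are assumptions standing in for a source table's change timestamp.

```python
def incremental_extract(rows, last_watermark):
    """Change Data Capture by watermark: pull only rows whose update
    timestamp is strictly newer than the last successful load, and
    return the new watermark to persist for the next run."""
    changed = [r for r in rows if r["updated_at"] > last_watermark]
    new_watermark = max((r["updated_at"] for r in changed),
                        default=last_watermark)
    return changed, new_watermark
```

In a mapping this corresponds to a Source Qualifier filter driven by a mapping variable (`WHERE updated_at > $$LastLoadTime`), with the new watermark written back after a successful session.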
Confidential, Bentonville, AR
ETL Informatica / Teradata Developer
Responsibilities:
- Involved in requirements gathering, analysis, function/technical specifications, development, deploying and testing.
- Prepare/maintain documentation on all aspects of ETL processes to support knowledge transfer to other team members.
- Used Informatica Power Center for migrating data from various OLTP databases to the data mart.
- Worked with different sources like Oracle, SQL Server, Flat Files and XML.
- Created Mappings using Mapping Designer to load data from various sources, made use of various Designer transformations like Source Qualifier, Connected and Unconnected Lookups, Expression, Filter, Router, Sorter, Aggregator, Joiner, Normalizer, Rank, Sequence Generator, Union and Update Strategy transformations.
- Created mapplets using Mapplet Designer and used those Mapplets for reusable business process in development.
- Worked on different tasks in workflows like sessions, event raise, event wait, decision, e-mail, command, worklets, assignment, timer and scheduling of the workflow.
- Performance tuning of Informatica designer and workflow objects.
- Extracted Axiom source data and loaded into target.
- Used Teradata Utilities (BTEQ, Multi-Load, and Fast-Load) to maintain the database
- Created pre/post session commands and pre/post session SQLs to perform tasks before and after the sessions.
- Used the Data Masking transformation to change sensitive production data to realistic test data for non-production environments.
- The objective was to extract data from Flat Files and an Oracle database and load it into a single data warehouse repository in Oracle (Facets).
- Built a reusable staging area in Teradata for loading data from multiple source systems, using template tables for profiling and cleansing in IDQ.
- Implemented parallelism in loads by partitioning workflows using Pass-through partitions.
- Worked with Teradata Viewpoint for monitoring workload management.
- Implemented slowly changing dimensions (Type I and Type II) for customer Dimension table loading.
- Created UNIX KSH shell scripts to kick off Informatica workflow in batch mode.
- Invoked Informatica using “pmcmd” utility from the UNIX script.
- Involved in Unit testing, Iterative testing to check whether the data loads into target are accurate, which was extracted from different source systems according to the user requirements.
- Provided support for the applications after production deployment to take care of any post-deployment issues.
Environment: Informatica Power Center 9.5.1, IDQ, Oracle 10g, SQL Server, Axiom, Teradata 14.0, Flat Files, XML, UNIX shell scripting.
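Several roles above mention creating parameter files for workflows. As an illustration, the sketch below renders a Power Center-style parameter file section; the `[folder.WF:workflow]` header follows Power Center's convention, but the folder, workflow and parameter names are hypothetical.

```python
def build_param_file(folder, workflow, params):
    """Render one section of an Informatica-style parameter file:
    a [folder.WF:workflow] header followed by $$NAME=value lines."""
    lines = [f"[{folder}.WF:{workflow}]"]
    lines += [f"$${name}={value}" for name, value in params.items()]
    return "\n".join(lines) + "\n"
```

Generating these files from a script (rather than editing them by hand) is what makes the validation scripts mentioned above practical: the same generator can be re-run per environment with different values.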
Confidential, Phoenix, AZ
Informatica Developer
Responsibilities:
- Analyzed business requirements and worked closely with various application and business teams to develop ETL procedures that are consistent across all applications and systems.
- Wrote Informatica ETL design documents, established ETL coding standards and performed Informatica mapping reviews.
- Extensively worked on Power Center Client Tools like Power Center Designer, Workflow Manager, and Workflow Monitor.
- Analyzed the source data coming from different sources and worked on developing ETL mappings/mapplets.
- Extracted data from Relational databases, Legacy Systems, XML files and Flat files to Oracle database using Informatica Power Center.
- Made use of various Designer transformations like Source Qualifier, Connected and Unconnected Lookups, Expression, Filter, Router, Sorter, Aggregator, Joiner, Normalizer, Rank, Sequence generator, Union and Update Strategy transformations.
- Created Parameter files and validation scripts, Created Reusable and Non-Reusable command task in Workflow Manager.
- Created Sessions, command task, reusable worklets and workflows in Workflow Manager.
- Created UNIX KSH shell scripts to kick off Informatica workflow in batch mode.
- Invoked Informatica using “pmcmd” utility from the UNIX script.
- Involved in the end-to-end testing process.
Environment: Informatica Power Center 8.6.1/9.1.1, Oracle 10g, Flat Files, XML Files, UNIX.
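The pmcmd invocations from UNIX scripts mentioned in these roles can be illustrated by assembling the command line in Python. This sketch only builds the argument list (it does not run pmcmd, which requires an Informatica server); the service, domain, user and folder values are placeholders, and the password comes from an environment variable via `-pv` rather than appearing on the command line.

```python
def pmcmd_start_workflow(service, domain, user, folder, workflow, wait=True):
    """Assemble a `pmcmd startworkflow` command line as a UNIX wrapper
    script would invoke it. Connection values are placeholders."""
    cmd = ["pmcmd", "startworkflow",
           "-sv", service,            # integration service
           "-d", domain,              # domain name
           "-u", user,                # user name
           "-pv", "PM_PASSWORD",      # password env var (not plaintext)
           "-f", folder]              # repository folder
    if wait:
        cmd.append("-wait")          # block until the workflow finishes
    cmd.append(workflow)
    return cmd
```

Running in `-wait` mode is what lets a scheduler like Control-M or Maestro use the script's exit code to decide whether downstream jobs may start.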
Confidential
Informatica Developer
Responsibilities:
- Developed ETL mappings, Transformations and Loading using Informatica Power Center 8.6.1.
- Extensively used ETL to load data from flat files and MS Excel, involving both fixed-width and delimited files, as well as from a relational database (Oracle 10g).
- Developed and tested all the Informatica mappings, sessions and workflows - involving several Tasks.
- Worked on Dimension as well as Fact tables, developed mappings and loaded data on to the relational database.
- Worked extensively on different transformation types like Source Qualifier, Expression, Filter, Aggregator, Update Strategy, Lookup, Sequence Generator, Joiner and Stored Procedure.
- Analyzed the session, event and error logs for troubleshooting mappings and sessions.
- Provided support for the applications after production deployment to take care of any post-deployment issues.
Environment: Informatica 8.6.1, Oracle 10g, Flat Files, SQL Programming, UNIX, Windows, MS Excel, SQL*Plus.
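The fixed-width flat-file loading described in this role can be sketched as a simple record splitter. This is an illustrative Python sketch of what a flat-file source definition does; the layout (field names and widths) is hypothetical.

```python
def parse_fixed_width(line, layout):
    """Split one fixed-width record into named fields. `layout` is a
    list of (name, width) pairs; values are right-stripped, mirroring
    how a source definition typically trims trailing padding."""
    record, pos = {}, 0
    for name, width in layout:
        record[name] = line[pos:pos + width].rstrip()
        pos += width
    return record
```

A delimited file would instead split on the separator; the fixed-width case is the one where getting the column offsets exactly right matters, which is why the layout is kept as explicit data rather than hard-coded slices.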