
Informatica Developer Resume


Columbus, OH

SUMMARY:

  • Experience with Data Warehousing using Informatica PowerCenter 10.2.0/9.x/8.x, Informatica PowerMart 7.x/6.x, PowerConnect, PowerExchange, and IDQ
  • Experience developing complex mappings, mapplets, and transformations (Router, Normalizer, Joiner, Lookup, Update Strategy, etc.) between source and target using Informatica PowerCenter Designer, with working knowledge of Teradata RDBMS architecture, applications, and tools & utilities
  • Extensive experience in designing and developing complex mappings from varied transformation logic such as Unconnected and Connected Lookups, Normalizer, Source Qualifier, Router, Filter, Expression, Aggregator, Joiner, and Update Strategy
  • Extensively worked with Informatica BDM 10.1.1, IDQ, and Informatica PowerCenter throughout complete data quality projects
  • Used Informatica BDM/IDQ 10.1.1 (Big Data Management) to ingest data from the AWS S3 raw zone to the S3 refine zone and from refine to Redshift
  • Implemented data warehousing methodologies for Extraction, Transformation and Loading using Informatica Designer, Repository Manager, Workflow Manager, Workflow Monitor, Repository Server Administration Console.
  • Experience in Informatica administration with Informatica PowerCenter and Informatica Data Quality 9.x/8.x, including installation, upgrades, migration, configuration, and maintenance; job scheduling and systems monitoring to ensure uninterrupted data and batch-job processing; technical support and troubleshooting of Informatica ETL server implementations in shared and dedicated environments; and user access maintenance, folder and connection creation, code deployment, table/view imports, repository metadata queries, code change management, and repository backups.
  • Proficient in Informatica Designer Components Source Analyzer, Target Designer, Mapping Designer, Mapplet Designer, Transformation Developer, Workflow Manager and Workflow Monitor.
  • Efficient in troubleshooting and problem-solving: validating, testing, and debugging errors within ETL transformations using the Informatica Debugger.
  • Experience with pre-session and post-session shell scripts for tasks such as merging flat files after creation, deleting temporary files, and renaming files to reflect the generation date (see the sketch after this summary).
  • Expertise in trouble shooting, performance tuning, and optimization of ETL and reporting analysis.
  • Strong understanding of quality processes and development of highly reliable solutions.
  • Experience in all aspects of Project Development Life Cycle in Data Warehousing like Requirements gathering, Analysis, Design, Implementation, Testing and Maintenance.
  • Worked through all phases of the Software Development Life Cycle (SDLC), both Waterfall and Agile (Scrum)
  • Worked on Slowly Changing Dimensions (Type 1, 2, 3) and their implementation to keep track of historical data.
  • Used TOAD, SQL Developer, SQL Server Management Studio for testing SQL queries.
  • Extensive working experience with databases such as Oracle and SQL Server
  • Expertise in Unit Testing, Integration Testing, System Testing and Data Validation for Developed Informatica Power Center Mappings.
  • Ability to adapt quickly to new technologies.
  • Good experience with Informatica Data Quality (IDQ)
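
A minimal sketch of the post-session shell script pattern mentioned in the summary above, assuming hypothetical directory and file names:

    #!/bin/ksh
    # Post-session step (illustrative): merge the part files produced by a session,
    # stamp the merged file with the generation date, and remove the temporary parts.
    SRC_DIR=/data/etl/out
    ARCHIVE_DIR=/data/etl/archive
    RUN_DATE=$(date +%Y%m%d)

    cat "$SRC_DIR"/cust_extract_part_*.dat > "$ARCHIVE_DIR/cust_extract_${RUN_DATE}.dat"
    rm -f "$SRC_DIR"/cust_extract_part_*.dat     # delete temporary part files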

TECHNICAL SKILLS:

Languages: PL/SQL, T-SQL, Unix Shell Script, Kernel Programming

Primary Skills: Teradata SQL, Teradata Utilities, AIX 5.1/5.3 (UNIX), Shell Scripting, C++, Data Warehousing architecture, Informatica PowerCenter 10.x/9.x/8.x/7.x, Informatica Developer, Informatica BDM/IDQ 10.1.1

Database: Oracle 12c/11g/10g/9i/8i, SQL Server 2012, DB2, SAP BW, MS Access, Teradata V2R5/V2R6

Operating Systems: UNIX (Sun Solaris), BackTrack R3, Windows 10/8/7/XP, Linux

Data Modeling: MS Visio 2010, Star Schema Modeling, Snowflake Modeling

Database Tools: TOAD, SQL *Plus, SQL Developer, SQL Server Management Studio

Reporting Tools: Business Objects, OBIEE

PROFESSIONAL EXPERIENCE:

Confidential, Columbus, OH

Informatica Developer

  • At Confidential I worked as a senior BI/ETL Developer. I worked on application enhancements and support at Galaxy Data Warehouse. I was involved in design, development, testing, support and monthly release activities.
  • As a software Engineering professional, I worked across the Service Delivery Lifecycle to analyze, design, build, test, implement and/or maintain multiple system components or applications.
  • Worked in the Rapid Response area of the Marketing team; as part of the team, resolved critical ETL abends by navigating Informatica and Linux logs and shell script files.
  • Debugged complex Linux shell scripts, identified issues in the code and data files, and resolved them quickly to bring production back up (see the sketch after this list).
  • Provided detailed analysis documentation for incidents and work requests in the ServiceNow tool, which helps the team resolve future issues and work on enhancements.
  • Used the CA Workstation scheduler tool to monitor production jobs and to trigger and simulate jobs in lower environments for project testing.
  • Ingested data from the S3 raw bucket and loaded it into HBase and Hive using Informatica BDM/IDQ.
  • Used Informatica BDM to refine the data, with scope for delta processing, and to cleanse the data moving from S3 raw to S3 refine.
  • Extensively worked with the Teradata and Oracle databases to verify the data loaded and the Informatica ETL parameters defined.
  • Debugged complex Teradata queries and fixed them on the fly to resolve production issues.
  • Environment: Teradata 14/15.10/16.00, Fast Load, Multi Load, Fast Export, Teradata Administrator, Informatica Power Center 9.6/10.2.0, Informatica BDM 10.1.1, UNIX, Unix shell programming, Windows Server 2008 R2, Oracle 11g/10g, SQL, PL/SQL.
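
A hedged sketch of the abend-triage steps referenced in the list above: scan the latest Informatica session log for errors and confirm the source file landed. The paths, session name, and file patterns are assumptions, not taken from the actual project.

    #!/bin/ksh
    # Triage an ETL abend (illustrative): pull the last error lines from the newest
    # session log and check that the expected input file arrived.
    LOG_DIR=/infa/infa_shared/SessLogs
    SRC_DIR=/data/inbound
    latest_log=$(ls -t "$LOG_DIR"/s_m_load_customer*.log 2>/dev/null | head -1)

    grep -E "ERROR|FATAL" "$latest_log" | tail -20   # most recent error messages
    ls -l "$SRC_DIR"/customer_*.dat                  # verify the source file exists and is non-empty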

Confidential, Atlanta, GA

Teradata Developer

  • Involved in Requirement Analysis, System Design, Preparing Functional Specifications/Design
  • Created tables and shell views in the Confidential Teradata environment for the Staging, CurrentLoad, and IDS layers.
  • Created Views on top of the final EDW tables in order to generate the reports.
  • Worked extensively with complex mappings using different transformations like Source Qualifier, Expression, Filter, Joiner, Router, Union, Unconnected/Connected Lookup, Aggregator, and reusable transformations.
  • Migrated scripts from Oracle 10g to Oracle 11g for Confidential Legacy Providers management.
  • Created BTEQ scripts for pre-population of the work tables prior to the main load process (see the sketch after this list).
  • Used FastLoad for loading into the empty tables.
  • Involved in developing MLOAD scripts to load data from Load Ready Files to Teradata Warehouse.
  • Performed projects via the Software Development Life Cycle (SDLC), ensuring phases such as Requirements Gathering, Analysis/Design, Development, and Testing are completed before production implementation.
  • Created technical design documents and developed unit test documents.
  • Performed data troubleshooting and development; monitored system health and provided prompt solutions to system performance issues.
  • Tuned complex stored procedures using Teradata Explain for faster execution and developed database structures according to the requirements.
  • Developed various complex queries, stored procedures, packages, interfaces, and triggers in PL/SQL.
  • Performance tuning using explain plans, AWR reports, and query analysis.
  • Preparation of project Requirement Specification Document by analyzing the requirements.
  • Perform query optimization, performance and tuning (PL/SQL) using SQL Trace, TKPROF, Explain Plan
  • Environment: Teradata 14, Fast Load, Multi Load, Fast Export, Teradata Administrator, Informatica Power Center 9.6, UNIX, Windows Server 2008 R2, Unix shell programming, Oracle 11g/10g, SQL, PL/SQL, Salesforce Exact Target, CA, UC4, IDQ 8.6.1.
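
A sketch of the BTEQ pre-population step described in the list above: clear and reload a work table before the main load. The logon string, database, and table names are illustrative only.

    #!/bin/ksh
    # Pre-load step (illustrative): repopulate a work table ahead of the main load.
    bteq <<EOF
    .LOGON tdprod/etl_user,password;
    DELETE FROM stg_db.wrk_orders;
    INSERT INTO stg_db.wrk_orders
    SELECT order_id, customer_id, order_dt, order_amt
    FROM   stg_db.stg_orders
    WHERE  load_dt = CURRENT_DATE;
    .IF ERRORCODE <> 0 THEN .QUIT 8;
    .LOGOFF;
    .QUIT 0;
    EOF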

Confidential, Nashville, TN

Sr. ETL/Teradata Consultant

  • Used most of the transformations such as Transaction Control, Java, SQL transformation, Router, Sequence Generator, Expression, and Lookup (connected and unconnected) while transforming the data according to the business logic.
  • Achieved performance improvement by analyzing explain plans, identifying and defining indexes, collecting statistics on columns used in joins, and compressing appropriate attributes to save disk space.
  • Responsible for the Performance tuning at the Source Level, Target Level, Mapping Level and Session Level. Worked on Informatica Key range Partitions on session level to increase performance.
  • Wrote shell scripts that invoke BTEQ scripts to load the fact tables and the dimension tables (see the sketch after this list).
  • Worked with the Cognos team on any changes to the reporting layer.
  • Optimized the performance of the mappings by various tests on sources, targets and transformations.
  • Scheduled the sessions to extract, transform, and load data into the warehouse database per business requirements.
  • Migrated folders, mappings, and workflows across repositories and folders and to higher versions of Informatica.
  • Did majority of the ETL design, including Process specifications, Source to Target mappings, selection of the appropriate Teradata Extract/Load tools.
  • Coordinated improvements and enhancements to the data marts by acting as a bridge between the users and the development teams.
  • Performance tuned ETL Batch jobs, Reporting and Ad-hoc SQL.
  • Created stored procedures, macros, triggers, views, tables and other database objects.
  • Modified MVS JCL load scripts to support various data lengths for BTEQ and FASTLOAD.
  • Worked on loading of data from several flat files sources to Staging using Teradata Multiload, FastLoad.
  • Exported numerous files, created through extensive transformation of the data, to different vendors.
  • Provided on call support for the entire Enterprise Data Warehouse processes. Solved numerous issues arising from batch processing in minimal time.
  • Worked with the O&M Production Implementation team for Wireline and WiMAX customers.
  • Responsible for solving issues related to Order Management, Inventory Management and provisioning.
  • Support to End-users for Metasolv related queries on operational and provisioning activities.
  • Deploy change requests to customer's test and production environments.
  • Worked as a part of the Fault Management System (FMS) and WebLogic Integration (WLI).
  • Implemented ad hoc techniques to achieve business needs.
  • Assisted another Teradata DBA to bring the system in production.
  • Environment: Informatica Power Center 9.6, UNIX, Windows server 2008 R2, Unix shell programming, Oracle 11g/10g, SQL, PL/SQL, Salesforce Exact Target, CA, UC4, IDQ 8.6.1, Teradata 14, Fast Load, Multi Load, Fast Export.
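
A minimal sketch, under assumed names, of the shell-wrapper-around-BTEQ pattern noted in the list above: run the fact-load BTEQ script, check the return code, and refresh statistics on the join columns afterwards.

    #!/bin/ksh
    # Illustrative wrapper: run the fact-load BTEQ script, then collect statistics.
    BTEQ_SCRIPT=/etl/scripts/load_sales_fact.btq
    LOG_FILE=/etl/logs/load_sales_fact_$(date +%Y%m%d).log

    bteq < "$BTEQ_SCRIPT" > "$LOG_FILE" 2>&1
    rc=$?
    if [ $rc -ne 0 ]; then
        echo "Fact load failed with return code $rc" >> "$LOG_FILE"
        exit $rc
    fi

    # Refresh statistics on commonly joined columns (Teradata 14 syntax)
    bteq <<EOF >> "$LOG_FILE" 2>&1
    .LOGON tdprod/etl_user,password;
    COLLECT STATISTICS COLUMN (sale_dt), COLUMN (store_id) ON edw_db.sales_fact;
    .LOGOFF;
    EOF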

Confidential, Atlanta, GA

Informatica/Teradata Developer

  • Requirements Analysis, cost estimates, technical design creation and design reviews
  • Worked on Informatica Power Center tools- Designer, Repository Manager, Workflow Manager, and Workflow Monitor
  • Extracted data from various centers with the data in different systems like Oracle Database and SQL Server and loaded the data into Teradata staging using Informatica Power Center 9.5.
  • Involved in migration from Informatica Power Center 9.3 to Informatica Power Center 9.5.
  • Used Teradata utilities such as BTEQ, MultiLoad, FastLoad, TPump, and FastExport to get data into the system and export it to other databases.
  • Implemented unique Teradata features such as identity columns, XML loaders, and UDFs.
  • Implemented the concept of slowly changing dimensions (SCD) Type I and Type II to maintain current and historical data in the dimension (see the sketch after this list).
  • Worked with Informatica Data Quality (IDQ) toolkit, Analysis, data cleansing, data matching, data conversion, duplicate elimination and exception handling and monitoring capabilities of IDQ.
  • Created critical re-usable transformations, mapplets, and worklets wherever necessary.
  • Integrated IDQ mappings, rules as mapplets within Power Center Mappings.
  • Created Stored Procedures, Functions, Packages and Triggers using PL/SQL.
  • Extensively used SQL tools like TOAD, Rapid SQL and Query Analyzer, to run SQL queries to validate the data.
  • Involved in debugging Informatica mappings, testing of Stored Procedures and Functions, Performance and Unit testing of Informatica Sessions, Batches and Target Data.
  • Implemented restart strategy and error handling techniques to recover failed sessions.
  • Used Unix Shell Scripts to automate pre-session and post-session processes.
  • Worked with MultiLoad and FastLoad to load the large tables.
  • Used Autosys scheduler to schedule and run Informatica workflows on a daily/weekly/monthly basis.
  • Reviewed the defects raised by the UAT team and followed up on critical defects with the team to ensure they were fixed.
  • Environment: Informatica Power Center 9.6, UNIX, Windows Server 2008 R2, Unix shell programming, Oracle 11g/10g, SQL, PL/SQL, Teradata 12/14, Fast Load, Multi Load, Fast Export, Teradata Administrator, Teradata PMON, BTEQ, TPT, CA, UC4, IDQ 8.6.1.
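
A hedged sketch of the SCD Type II pattern mentioned in the list above, written as Teradata SQL inside a BTEQ step: expire the changed current rows, then insert the new versions. Database, table, and column names are assumptions for illustration.

    #!/bin/ksh
    # SCD Type II load (illustrative): close out changed rows, insert new versions.
    bteq <<EOF
    .LOGON tdprod/etl_user,password;

    /* Expire the current dimension row when a tracked attribute has changed */
    UPDATE d
    FROM edw_db.customer_dim AS d, stg_db.customer_stg AS s
    SET eff_end_dt = CURRENT_DATE - 1,
        current_flag = 'N'
    WHERE d.customer_id = s.customer_id
      AND d.current_flag = 'Y'
      AND d.address <> s.address;

    /* Insert a new current row for changed and brand-new customers */
    INSERT INTO edw_db.customer_dim
      (customer_id, address, eff_start_dt, eff_end_dt, current_flag)
    SELECT s.customer_id, s.address, CURRENT_DATE, DATE '9999-12-31', 'Y'
    FROM   stg_db.customer_stg AS s
    LEFT JOIN edw_db.customer_dim AS d
           ON d.customer_id = s.customer_id AND d.current_flag = 'Y'
    WHERE  d.customer_id IS NULL;

    .LOGOFF;
    EOF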

Confidential, Roswell, GA

Teradata Consultant

  • Resolved issues related to semantic layer or reporting layer.
  • Worked on different subject areas like Customer, Loyalty, Promotion and Item.
  • Wrote complex SQL using joins, subqueries, and correlated subqueries; expertise in SQL queries for cross-verification of data.
  • Developed the Teradata Macros, Stored Procedures to load data into Incremental/Staging tables and then move data from staging into Base tables.
  • Reviewed the SQL for missing joins and join constraints, data format issues, mismatched aliases, and casting errors.
  • Responsible for Design, Data Mapping Analysis, Mapping rules, Development, Coding & testing.
  • Responsible for Implementation & Post Implementation support.
  • Extensively used loader utilities to load flat files into Teradata RDBMS.
  • Analyzed the existing UNIX shell scripts in BCI to derive the business rules, then created the mappings between the source fields and the target table fields.
  • Used BTEQ and SQL Assistant (Queryman) front-end tools to issue SQL commands matching the business requirements to the Teradata RDBMS.
  • Performed high volume maintenance on large Teradata tables using MultiLoad loader utility.
  • Used the Fast Export utility to extract large volumes of data at high speed from the Teradata RDBMS (see the sketch after this list).
  • Developed TPump scripts to load low volume data into Teradata RDBMS at near real-time.
  • Performed tuning and optimization of application SQL using Query analyzing tools.
  • Environment: Teradata V2R14, Teradata SQL Assistant, MLOAD, FASTLOAD, BTEQ, TPUMP, Erwin, Informatica PowerCenter 8.6, Unix Scripting, VBA, MACROS, Windows XP
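
A sketch of the FastExport usage referenced in the list above, with illustrative logon, file, and table names: export a pipe-delimited extract at high speed.

    #!/bin/ksh
    # FastExport job (illustrative): unload loyalty members to a delimited flat file.
    fexp <<EOF > /etl/logs/exp_loyalty_$(date +%Y%m%d).log 2>&1
    .LOGTABLE work_db.exp_loyalty_log;
    .LOGON tdprod/etl_user,password;
    .BEGIN EXPORT SESSIONS 4;
    .EXPORT OUTFILE /etl/out/loyalty_members.txt MODE RECORD FORMAT TEXT;
    SELECT TRIM(member_id) || '|' || loyalty_tier || '|' || TRIM(points)
    FROM   edw_db.loyalty_member;
    .END EXPORT;
    .LOGOFF;
    EOF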

Confidential

Informatica Developer

  • Used Informatica client tools - Source Analyzer, Target designer, Mapping Designer, Mapplet Designer, and Transformation Developer for defining Source & Target definitions and designing the mappings of data flow from source system to data warehouse.
  • Logical and Physical design of databases using Erwin. Extensively used Erwin for data modeling and dimensional data modeling.
  • Implemented Slowly Changing Dimensions - Type III in mappings as per the requirements (see the sketch after this list).
  • Used Informatica Workflow Manager and Workflow Monitor to schedule and monitor session status.
  • Used Debugger to troubleshoot the mappings.
  • Developed mappings in Informatica to load the data from various sources into the Data Warehouse, using different transformations like Joiner, Aggregator, Update Strategy, Rank, Router, Lookup, Sequence Generator, Filter, Sorter, and Source Qualifier.
  • Implemented performance tuning logic on Targets, Sources, mappings, sessions to provide maximum efficiency and performance.
  • Defined Target Load Order Plan for loading data into Target Tables.
  • Prepared Unit Test Cases.
  • Set Standards for naming conventions and best practices for Informatica mapping development.
  • Environment: Informatica PowerCenter 8.6, Oracle, PL/SQL, TOAD, Autosys, UNIX, Erwin
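
An illustrative SCD Type III step, matching the bullet above, written as Oracle SQL run through SQL*Plus: keep the prior value in its own column rather than adding history rows. The connection string, table, and column names are assumed.

    #!/bin/ksh
    # SCD Type III update (illustrative): shift current value to the prior-value column.
    sqlplus -s etl_user/password@ORCL <<EOF
    UPDATE customer_dim d
    SET    d.prev_region = d.curr_region,
           d.curr_region = (SELECT s.region
                            FROM   customer_stg s
                            WHERE  s.customer_id = d.customer_id)
    WHERE EXISTS (SELECT 1
                  FROM   customer_stg s
                  WHERE  s.customer_id = d.customer_id
                  AND    s.region <> d.curr_region);
    COMMIT;
    EXIT;
    EOF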

Confidential

ETL Developer

  • Interacted with the end users to clarify incomplete requirements and developed code that satisfied the client.
  • Performed source system data analysis as per the business requirements; distributed data residing in heterogeneous data sources was consolidated onto Teradata staging using Informatica Power Center 8.3.
  • Used heterogeneous data sources (Oracle, DB2, XML files, flat files) and also imported stored procedures from Oracle for transformations.
  • Developed Mappings, Sessions, Workflows and Shell Scripts to extract, validate, and transform data according to the business rules
  • Identified the fact tables and slowly changing dimension (SCD) tables.
  • Extensively used SQL tools like TOAD, Rapid SQL, and Query Analyzer to run SQL queries and validate the data.
  • Sourced the data from XML files, flat files, SQL server tables and Oracle tables
  • Made substantial contributions in simplifying the development and maintenance of ETL by creating re-usable Mapplets and Transformation objects. Partitioned the sessions to reduce the load time
  • Performed data cleansing and cache optimization
  • Implemented Change Data Capture for incremental aggregation.
  • Involved in review of the mappings and enhancements for better optimization of the Informatica mappings, sessions and workflows
  • Extensively worked in performance tuning of programs, ETL procedures and processes
  • Performed Unit, Systems and Regression Testing of the mappings. Involved in writing the Test Cases and also assisted the users in performing UAT
  • Extensively used UNIX shell scripts to create the parameter files dynamically and scheduled jobs using Autosys (see the sketch after this list).
  • Created integration services, repository services and migrated the repository objects
  • Written PL/SQL procedures for processing business logic in the database
  • Provided production support and maintenance for all the applications with the ETL process
  • Environment: Informatica Power Center 8.3, Oracle 10g, Teradata, SQL, PL/SQL, TOAD, UNIX
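
A minimal sketch of the dynamic parameter-file creation mentioned in the list above, with hypothetical folder, workflow, and parameter names; Autosys would trigger the workflow after this script runs.

    #!/bin/ksh
    # Build today's Informatica parameter file (illustrative names throughout).
    PARAM_FILE=/infa/params/wf_daily_load.param
    RUN_DATE=$(date +%Y-%m-%d)

    cat > "$PARAM_FILE" <<EOF
    [FOLDER_EDW.WF:wf_daily_load.ST:s_m_load_orders]
    \$\$RUN_DATE=$RUN_DATE
    \$DBConnection_SRC=ORA_SRC
    \$DBConnection_TGT=TD_EDW
    EOF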

Confidential

Associate ETL Developer

  • Interacted with the Business Users to map Requirements
  • Extensively worked on Power Center Client Tools like Repository Admin Console, Repository Manager, Designer, Workflow Manager, and Workflow Monitor
  • Analyzed the source data coming from different sources (Oracle, DB2, XML, QCARE, Flat files) and worked on developing ETL mappings
  • Developed complex Informatica Mappings, reusable Mapplets and Transformations for different types of tests in research studies on daily and monthly basis
  • Implemented mapping-level optimization with the best route possible without compromising business requirements.
  • Created Sessions, reusable worklets and workflows in Workflow Manager and Scheduled workflows and sessions at specified frequency
  • Worked on fixing invalid Mappings, testing of Stored Procedures and Functions, and Integration Testing of Informatica Sessions
  • Responsible for the Performance tuning at the Source Level, Target Level, Mapping Level and Session Level
  • Worked extensively on SQL, PL/SQL, and UNIX shell scripting
  • Generated XML files to deliver to Thomson Reuters.
  • Performed data profiling for data quality purposes (see the sketch after this list).
  • Demonstrated accountability, including professional documentation and weekly status reports.
  • Performed Quantitative and Qualitative Data Testing
  • Documented flowcharts for the ETL (Extract, Transform, and Load) data flow using Microsoft Visio, created metadata documents for the reports and the mappings developed, and wrote unit test scenario documentation for the same.
  • Environment: Informatica Power Center 7.1.1, UNIX, Shell Scripting, Oracle 8 Enterprise Manager, SQL Profiler, DTS, PL/SQL.
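
A simple example of the kind of data-profiling query used for the quality checks above, run through SQL*Plus; the table and column names are illustrative only.

    #!/bin/ksh
    # Profile a staging table (illustrative): row count, distinct keys, null counts.
    sqlplus -s etl_user/password@ORCL <<EOF
    SELECT COUNT(*)                                          AS total_rows,
           COUNT(DISTINCT subject_id)                        AS distinct_subjects,
           SUM(CASE WHEN visit_dt IS NULL THEN 1 ELSE 0 END) AS null_visit_dates
    FROM   stg_study_results;
    EXIT;
    EOF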

Confidential

Investor Relations Officer

  • Work closely with associates to ensure that the targets are met and quality is maintained consistently.
  • Collaborate with management to gain the knowledge of specific work situations requiring associates to understand the changes in products and technologies.
  • Handle Escalation calls and Calls monitoring.
  • Conduct feedback sessions, one-to-one sessions, and monthly meetings to keep associates informed of individual and team performance.
  • Take care of Back Office operations and Pay in & Payout.
  • Build customer relations through regular interactions, gathering information on all possible business potentials and achieve the proposed targets.
  • Provide reliable and honest investment advice and recommendations.
