
Sr Informatica Developer Resume


Atlanta, GA

PROFESSIONAL SUMMARY:

  • 10+ years of IT experience in Data Warehousing, Database Design and ETL processes across business domains including finance, telecom, manufacturing and health care.
  • Highly proficient in Development, Implementation, Administration and Support of ETL processes for large-scale Data Warehouses using Informatica PowerCenter.
  • Worked extensively on ETL processes using Informatica PowerCenter 9.x/8.x/7.x.
  • Extensively used ETL methodologies to support data extraction, transformation and loading in a corporate-wide ETL solution using Informatica PowerCenter.
  • Extensively worked in Informatica Designer, Workflow Manager and Workflow Monitor to develop and run data loads.
  • Experience working with cloud computing on the Salesforce.com platform.
  • Extensive experience in using various Informatica Designer Tools such as Source Analyzer, Transformation Developer, Mapping Designer, Mapplet Designer.
  • Extensive experience in Design, Development, Implementation, Production Support and Maintenance of Data Warehouse Business Applications in the E-commerce, Utility, Pharmaceutical, Health Care, Insurance, Financial and Manufacturing industries.
  • Experience in development and maintenance of SQL, PL/SQL, Stored procedures, functions, analytic functions, constraints, indexes and triggers.
  • Extensive experience delivering Data warehousing implementations, Data migration and ETL processes to integrate data across multiple sources using Informatica PowerCenter and Informatica Cloud Services.
  • Experienced in IDQ (9.x, 9.5.1), handling LDOs, PDOs and transformations such as Standardizer, Labeler, Parser and Address Validator to cleanse and profile incoming data.
  • 8 years of experience using different versions of the Oracle database (11g/10g/9i/8i).
  • Excellent working knowledge of C shell scripting and job scheduling on multiple platforms; experience with the UNIX command line and Linux.
  • Experience in the ETL development process using Informatica for Data Warehousing, Data Migration and Production Support.
  • Excellent experience in designing, developing, documenting and testing ETL jobs and mappings in Server and Parallel jobs using DataStage to populate tables in the Data Warehouse.
  • Involved in a project maintaining commercial, property and casualty insurance data.
  • Experience in both Waterfall and Agile SDLC methodologies.
  • Experience with TOAD, SQL Developer database tools to query, test, modify, analyze data, create indexes, and compare data from different schemas.
  • Performed the data profiling and analysis making use of Informatica Data Explorer (IDE).
  • Worked on Slowly Changing Dimension (SCD) Types 1, 2 and 3 to keep track of historical data (a Type 2 sketch appears after this list).
  • Knowledge of Informatica PowerExchange (PowerConnect) for capturing changed data.
  • Proficiency in data warehousing techniques for data cleansing, surrogate key assignment and Change data capture (CDC).
  • Experience in integration of various data sources like Oracle, DB2, Flat Files and XML Files into ODS and good knowledge on Teradata 12.0/13.0, SQL Server 2000/2005/2008 and MS Access 2003/2007.
  • Expertise in implementing complex business rules by creating re-usable transformations, Mapplets and Mappings.
  • Optimized solutions using various performance-tuning methods: SQL tuning, ETL tuning (optimal configuration of transformations, targets, sources, mappings and sessions), and database tuning using indexes, partitioning, materialized views, procedures and functions.
  • Extensively used Autosys and Tidal for scheduling the UNIX shell scripts and Informatica workflows.
  • Extensive knowledge in all areas of Project Life Cycle Development.
  • Strong analytical, verbal, written and interpersonal skills.
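
A minimal sketch of the SCD Type 2 pattern referenced above, as a shell wrapper around SQL*Plus: expire the current dimension row when tracked attributes change, then insert a new current version. The connect string, table names (DIM_CUSTOMER, STG_CUSTOMER, DIM_CUSTOMER_SEQ) and tracked columns are illustrative assumptions, not taken from any specific project.

#!/bin/sh
# scd2_load.sh - illustrative SCD Type 2 load; all object names are placeholders.
# Assumes ORA_CONN holds a valid user/password@tns connect string.

sqlplus -s "${ORA_CONN}" <<'EOF'
WHENEVER SQLERROR EXIT FAILURE

-- Close out current rows whose tracked attributes changed in staging
UPDATE dim_customer d
   SET d.current_flag = 'N',
       d.effective_end_date = TRUNC(SYSDATE) - 1
 WHERE d.current_flag = 'Y'
   AND EXISTS (SELECT 1
                 FROM stg_customer s
                WHERE s.customer_id = d.customer_id
                  AND (s.address <> d.address OR s.status <> d.status));

-- Insert a new current version for changed or brand-new customers
INSERT INTO dim_customer
       (customer_key, customer_id, address, status,
        effective_start_date, effective_end_date, current_flag)
SELECT dim_customer_seq.NEXTVAL, s.customer_id, s.address, s.status,
       TRUNC(SYSDATE), DATE '9999-12-31', 'Y'
  FROM stg_customer s
 WHERE NOT EXISTS (SELECT 1
                     FROM dim_customer d
                    WHERE d.customer_id = s.customer_id
                      AND d.current_flag = 'Y'
                      AND d.address = s.address
                      AND d.status = s.status);

COMMIT;
EOF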

TECHNICAL SKILLS:

Databases: Oracle 10g/9i/11i/R12, DB2, MS SQL Server 7.0/2000/2005/2008, MS Access 2000/2005, Teradata, MySQL, Azure SQL.

OLAP/Reporting Tools: SQL Server Analysis Service (SSAS), SQL Server Reporting Service (SSRS)

ETL Tools: Informatica PowerCenter 10.x/9.x/8.x/7.x, Informatica Cloud, Informatica PowerExchange, Informatica Data Quality Suite 9.6, SQL Server Integration Services (SSIS)

SQL Server Tools: SQL Server Management Studio

Other Tools: Microsoft Office, Visual Basic 6

Scheduling Tools: Tidal, Autosys, Windows Scheduler

Data Quality Tools: Informatica Analyst, Informatica Data Quality, Informatica Developer

PROFESSIONAL EXPERIENCE:

Confidential, Atlanta, GA

Sr Informatica Developer

Responsibilities:

  • Worked on requirements gathering, architecting the ETL lifecycle and creating design specifications, ETL design documents.
  • Identified and eliminated duplicate records using Edit Distance, Jaro Distance and Mixed Field matchers, enabling a single view of the customer and helping control mailing-list costs by preventing duplicate mailings.
  • Responsible for unit and integration testing of Informatica sessions, batches and the target data.
  • Scheduled workflows to pull data from the source databases at weekly intervals to maintain the most current, consolidated data.
  • Developed Mapplets, Reusable Transformations, update strategy, router, look up, expression, aggregator transformations, Source and Target definitions, mappings using Informatica 10.0.
  • Designed and developed transformation rules (business rules) to generate consolidated (fact/summary) data using Informatica ETL tool.
  • Involved in converting specifications to programs and data mappings in an Informatica Cloud ETL environment.
  • Design, development, testing and implementation of ETL processes using Informatica Cloud.
  • Pulled data into Power BI from various sources such as SQL Server and Oracle.
  • Deployed reusable transformation objects such as mapplets to avoid duplication of metadata, reducing the development time.
  • Continuously reviewed business-critical databases to proactively identify space and performance-tuning issues.
  • Involved in migration projects moving data warehouses from Oracle/DB2 to Teradata.
  • Developed and maintained ETL (Extract, Transformation and Loading) mappings to extract the data from multiple source systems like Oracle, XML, SQL server and Flat files and loaded into Oracle.
  • Worked on the design and development of Informatica mappings and workflows to load data into the staging area, data warehouse and data marts in Oracle.
  • Provided extensive Production Support for Data Warehouse for internal and external data flows to Netezza, Oracle DBMS from ETL servers via remote servers.
  • Used Informatica Power Exchange for loading/retrieving data from mainframe systems.
  • Designed the source-to-target mappings and was involved in designing the selection criteria document.
  • Wrote BTEQ scripts to transform data and used the Teradata utilities FastLoad, MultiLoad and TPump to load data.
  • Experienced working with Azure SQL Data Warehouse.
  • Designed and developed DataStage jobs to extract data from heterogeneous sources, applied transformation logic to the extracted data and loaded it into data warehouse databases.
  • Developed Informatica process to replace stored procedure functionalities and provide a time effective and high data quality application to the client.
  • Performed transfer and loading of files to DB2
  • Worked on all types of transformations available in the Power BI Query Editor.
  • Excellent knowledge of using Informatica connections to load and extract data to and from Teradata efficiently.
  • Used the DataStage Director and its run-time engine to schedule solution runs, test and debug components, and monitor jobs.
  • Transferred data from various data sources/business systems including DB2, MS Excel, MS Access and flat files to SQL Server using SSIS packages, using features such as the Excel source, flat file source and transformations, and created derived columns from existing columns per the requirements.
  • Analyzed business requirements and created ETL logic to extract data from flat files coming from manufacturing sites in different geographic regions and load it into the data warehouse.
  • Used Tableau Desktop for dashboard development activities such as calculations, hierarchies, filters and actions, and defined indicators and gauges for the various KPIs and measures with various charts and graphs.
  • Created data maps and extraction groups in PowerExchange.
  • Verified if the data model helps in retrieving the required data by creating data access paths in the data model.
  • Involved in designing, implementing and performing administration functions for enterprise data integration environments including Informatica PowerCenter and Informatica Data Quality (IDQ).
  • Proficient in Creating, Configuring and Fine-tuning ETL workflows designed in MS SQL Server Integration Services (SSIS).
  • Experienced in providing Logging, Error handling by using Event Handlers, and Custom Logging for SSIS Packages.
  • Proficient in designing and scheduling complex SSIS Packages for transferring data from multiple data sources to SQL server.
  • Prepared ETL Specifications and design documents to help develop mappings.
  • Worked on staging the data into worktables, cleanse, and load it further downstream into dimensions using Type 1 and Type 2 logic and fact tables which constitute the data warehouse.
  • Worked with pmcmd to interact with the Informatica server from command mode and execute the shell scripts (a sketch follows this list).
  • Project based on the Agile SDLC methodology, with software releases to business users every two weeks.
  • Took part in daily stand-up and scrum meetings to discuss the project lifecycle and progress and to plan accordingly, which is the crux of Agile SDLC.
  • Provided post-release/production support.
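
A minimal sketch of driving a PowerCenter workflow from the command line with pmcmd, as mentioned above. The domain, Integration Service, folder, workflow and credential names are placeholders, not actual project values.

#!/bin/sh
# run_wf_daily_load.sh - illustrative pmcmd wrapper; every name below is a placeholder.

INFA_USER="etl_user"
INFA_PWD="${INFA_PASSWORD}"   # assumed to be supplied by the environment or a vault
DOMAIN="Domain_DEV"
INT_SVC="IS_DEV"
FOLDER="FIN_DW"
WORKFLOW="wf_daily_load"

# Start the workflow and wait for completion so the exit code reflects success/failure
pmcmd startworkflow -sv "$INT_SVC" -d "$DOMAIN" \
      -u "$INFA_USER" -p "$INFA_PWD" \
      -f "$FOLDER" -wait "$WORKFLOW"

if [ $? -ne 0 ]; then
    echo "Workflow $WORKFLOW failed" >&2
    exit 1
fi
echo "Workflow $WORKFLOW completed successfully"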

Environment: Informatica PowerCenter 10.0, Informatica Cloud, PowerExchange, DataStage, PL/SQL, Oracle Database 11g, SQL Server, AWS Redshift, Toad for Oracle, Autosys, UNIX shell scripts, Teradata, Azure SQL.

Confidential, Boston, MA

ETL / Informatica Developer

Responsibilities:

  • Used Informatica Power Center v9.6 for extraction, transformation and load (ETL) of data in the data warehouse.
  • Productively used Informatica tools - Informatica Repository Manager, Informatica Power Center Designer, Informatica Workflow Manager and Informatica Workflow Monitor.
  • Created complex mappings using the concept of Slowly Changing Dimensions; involved in implementing business logic and capturing rows deleted in the source system.
  • Worked extensively with the connected lookup transformations with the dynamic cache enabled.
  • Developed several reusable transformations and Mapplets, which were used in other mappings.
  • Created sessions and extracted data from various sources.
  • Transformed the data according to the requirements and loaded it into the data warehouse.
  • Involved in creating various mappings in the Informatica power center designer.
  • Used Informatica Power Center for extractions, transformation, and loading data from heterogeneous sources into the target databases.
  • Responsible for optimization of SQL queries, T-SQL and SSIS Packages.
  • Modeled new data models based on user requirements and updated the existing data model; created new metadata and updated existing metadata.
  • Worked on Teradata stored procedures and functions to conform the data and load it into the tables.
  • Involved in requirement analysis, ETL design and development for extracting data from source systems like Salesforce, mainframe, DB2, Sybase, Oracle and flat files and loading it into Netezza.
  • Worked extensively on complex mappings using transformations such as Source Qualifier, Expression, Router, Filter, Update Strategy, connected and unconnected Lookup, Normalizer, Joiner and Aggregator.
  • Designed and developed stored procedures, queries and views necessary to support SSRS reports.
  • Designed several processes in Informatica Cloud and exposed them as RESTful API services to publish data to external systems.
  • Provided tuning recommendations and future memory requirements to the primary DBA team to make database changes such as table reorganizations and adding sufficient space.
  • Using SQL Server Integration Services (SSIS) to populate data from various data sources
  • Used DB2 tools to migrate database objects, to monitor thread activity.
  • Used highly complex T-SQL queries and SQL scripts to perform efficient data loads based on complex business rules (a simplified sketch follows this list).
  • Extensively used T-SQL to manipulate data and produce results for users, using multiple join statements to retrieve data from multiple tables.
  • Used Power Exchange to extract DB2 Source from mainframe server.
  • Extracted Data from different Source Systems like Oracle, Teradata and Flat files.
  • Designed and created data extracts, supporting SSRS, POWER BI.
  • Experienced in optimizing the SQL queries to improve the performance.
  • Prepared the complete data mapping for all the migrated jobs using SSIS.
  • Worked on SSRS and delivered complex reports from different data sources such as SQL Database.
  • Experience in data analysis for source and target systems and a good understanding of data warehousing concepts: dimensions, facts, star and snowflake schemas, and ER modeling.
  • Used ETL (SSIS) to develop jobs for extracting, cleaning, transforming and loading data into data warehouse.
  • Responsible for Unit testing, System and Integration testing.
  • Transferred data from various data sources/business systems including DB2, MS Excel, MS Access and flat files to SQL Server using SSIS packages, using features such as the Excel source, flat file source and transformations, and created derived columns from existing columns per the requirements.
  • Involved in parsing and handling structured and unstructured data such as JSON and NoSQL sources.
  • Set up single sign-on for AWS Redshift using Active Directory and set up Redshift integration with Informatica.
  • Worked with JSON schemas to call and test web services.
  • Extensively used the Debugger to identify bugs in existing mappings by analyzing data.
  • Identified and fixed performance bottlenecks and tuned the Informatica mappings for better Performance.
  • Extensively used Informatica debugger to figure out the problems in mapping. Also involved in troubleshooting existing ETL bugs.
  • Analyzed Session log files to resolve error in mapping and identified bottlenecks and tuned them for optimal performance.
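
A simplified sketch of the rule-driven T-SQL load mentioned above, wrapped in a shell script that calls sqlcmd. The server, database, schema and table names are illustrative assumptions; authentication options are omitted and would be added per site standards.

#!/bin/sh
# load_fact_orders.sh - illustrative incremental T-SQL load via sqlcmd; names are placeholders.

SQL_SERVER="sqlsrv01"
DATABASE="SalesDW"

# -b makes sqlcmd return a non-zero exit code if the batch fails
sqlcmd -S "$SQL_SERVER" -d "$DATABASE" -b <<'EOF'
-- Insert only orders not already loaded, joining staging to dimensions per the business rule
INSERT INTO dbo.FactOrders (OrderID, CustomerKey, ProductKey, OrderAmount, LoadDate)
SELECT s.OrderID,
       c.CustomerKey,
       p.ProductKey,
       s.OrderAmount,
       GETDATE()
FROM   stg.Orders      AS s
JOIN   dbo.DimCustomer AS c ON c.CustomerID  = s.CustomerID
JOIN   dbo.DimProduct  AS p ON p.ProductCode = s.ProductCode
WHERE  NOT EXISTS (SELECT 1
                   FROM   dbo.FactOrders f
                   WHERE  f.OrderID = s.OrderID);
GO
EOF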

Environment: Informatica PowerCenter 9.6 (Workflow Manager, Workflow Monitor, Mapplets), Oracle, T-SQL, Teradata, SQL.

Confidential, St. Louis, MO

Sr Informatica Developer

Responsibilities:

  • Analyzed the business requirements, framed the business logic for the ETL process and maintained the ETL process using Informatica PowerCenter.
  • Created complex Informatica mappings to load the data mart and monitored them. The mappings involved extensive use of transformations like Aggregator, Filter, Router, Expression, Joiner, Union, Normalizer and Sequence generator.
  • Developed ETL Informatica mappings to load data into the staging area, extracting from mainframe files and databases and loading into the Oracle 11g target database.
  • Created workflows and worklets for Informatica mappings.
  • Worked on SQL coding to override the generated SQL query in Informatica.
  • Involved in unit testing for the validity of the data from different data sources.
  • Designed and developed PL/SQL packages, stored procedures, tables, views, indexes and functions; experienced in dealing with partitioned tables and automating partition drop and create in the Oracle database.
  • Involved in migrating the ETL application from the development environment to the testing environment.
  • Performed data validation in the target tables using complex SQL to make sure all the modules were integrated correctly.
  • Used SQL queries and database programming using PL/SQL (writing Packages, Stored Procedures/Functions, and Database Triggers).
  • Wrote T-SQL statements for data retrieval and was involved in T-SQL performance tuning.
  • Designed SSIS Packages to transfer data from flat files to SQL Server using Business Intelligence Development Studio.
  • Worked with the Informatica toolkit for analysis, data cleansing, data matching, data conversion, exception handling, and reporting and monitoring.
  • Performed data conversion/data migration using Informatica PowerCenter.
  • Extensively used SSIS transformations such as Lookup, Derived column, Data conversion, Aggregate, Conditional split, SQL.
  • Involved in performance tuning for a better data migration process.
  • Analyzed session log files to resolve errors in mappings and identified bottlenecks and tuned them for optimal performance.
  • Created UNIX shell scripts for Informatica pre/post-session operations.
  • Reviewed all development queries and performed optimization and query performance tuning for Netezza using various techniques.
  • Created Complex ETL Packages using SSIS to extract data from staging tables to tables with incremental load.
  • Worked on Direct Connect process to transfer the files between servers.
  • Documented and presented the production/support documents for the components developed when handing over the application to the production support team.
  • Worked with XML targets for the data coming from SQL server source.
  • Performed query tuning and used SQL query overrides in the Source Qualifier transformation to pull only historical data from the database not earlier than a given date, i.e. change data capture (CDC).
  • Parameterized the whole process by using the parameter file for the variables.
  • Imported XSD files to create the XML target and created the hierarchical relationship and normalized views.
  • Implemented the logic by using HTTP transformation to query the web server.
  • Created complex shell scripts to automate sets of actions such as validating the presence of indicator files (a simplified sketch follows this list).
  • Pushed the compressed and encrypted XML files and flat files generated to the external vendor using MFT.
  • Involved in Unit testing and system integration testing (SIT) of the projects.
  • Assisted team members with the mappings developed, as part of knowledge transfer.
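
A simplified sketch of the indicator-file validation mentioned above, the kind of script a pre-session command task can call; the path, file name and timeout are placeholders.

#!/bin/sh
# check_indicator.sh - illustrative pre-session check: proceed only after the source
# system drops its "done" indicator file. Paths and limits are placeholders.

IND_FILE="/data/inbound/vendor/daily_extract.done"
MAX_WAIT_MIN=60
WAITED=0

while [ ! -f "$IND_FILE" ]; do
    if [ "$WAITED" -ge "$MAX_WAIT_MIN" ]; then
        echo "Indicator file $IND_FILE not found after ${MAX_WAIT_MIN} minutes" >&2
        exit 1    # non-zero exit fails the session and alerts support
    fi
    sleep 60
    WAITED=`expr $WAITED + 1`
done

echo "Indicator file found; starting the load"
exit 0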

Environment: Informatica PowerCenter 8.6.1/8.1.1, Windows Server 2008, MS SQL Server 2005, T-SQL, Batch Scripting, Perl Scripting, XML Targets, Flat Files, Tidal 5.3.1, UNIX.

Confidential

ETL Developer/Analyst

Responsibilities:

  • Involved in business analysis and technical design sessions with business and technical staff to develop requirements documents and ETL specifications.
  • Involved in dimensional modeling and data modeling using the Erwin tool.
  • Created high-level Technical Design Document and Unit Test Plans.
  • Developed mapping logic using various transformations like Expression, Lookups (Connected and Unconnected), Joiner, Filter, Sorter, Update strategy and Sequence generator.
  • Wrote complex SQL override scripts at source qualifier level to avoid Informatica joiners and Look-ups to improve the performance as the volume of the data was heavy.
  • Responsible for creating workflows; created Session, Event, Command, Control, Decision and Email tasks in Workflow Manager.
  • Prepared user requirement documentation for mapping and additional functionality.
  • Extensively used PowerCenter ETL to load data from source systems such as flat files into staging tables and then into the target Oracle database; analyzed the existing systems and performed a feasibility study.
  • Analyzed the current system and programs and prepared gap analysis documents.
  • Experience in performance tuning and optimization of SQL statements using SQL trace (a sketch follows this list).
  • Involved in unit, system integration and user acceptance testing of mappings.
  • Supported the process steps in the development, test and production environments.
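
A minimal sketch of the SQL trace workflow mentioned above: enable tracing for one session, run the statement under review, then format the raw trace with tkprof. The connect string, query, trace directory and file names are illustrative assumptions.

#!/bin/sh
# trace_query.sh - illustrative SQL trace + tkprof run; names and paths are placeholders.

ORA_CONN="${ORA_CONN:?set ORA_CONN to user/password@tns}"
TRACE_DIR="/u01/app/oracle/diag/trace"   # site-specific user trace directory (assumption)

sqlplus -s "$ORA_CONN" <<'EOF'
ALTER SESSION SET TRACEFILE_IDENTIFIER = 'stage_load_review';
ALTER SESSION SET SQL_TRACE = TRUE;

-- statement under review (placeholder query)
SELECT /* tuning candidate */ COUNT(*)
  FROM stg_orders s
  JOIN dim_customer d ON d.customer_id = s.customer_id;

ALTER SESSION SET SQL_TRACE = FALSE;
EOF

# Pick up the newest trace file tagged with our identifier and format it
TRC_FILE=`ls -t "$TRACE_DIR"/*stage_load_review*.trc 2>/dev/null | head -1`
tkprof "$TRC_FILE" /tmp/stage_load_review.txt sys=no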

Environment: Informatica PowerCenter 8.1.4/7.1.4, Oracle 10g/9i, TOAD, Business Objects 6.5/XIR2, UNIX, ClearCase.

Confidential

ETL Developer

Responsibilities:

  • Development, testing, implementation and support of users.
  • Prepared ETL Specifications and design documents to help develop mappings.
  • Created mappings for Historical and Incremental Loads.
  • Used Version Control to check in and checkout versions of objects.
  • Tuned Source, Target, Mappings, Transformations and Sessions for better performance.
  • Supported daily loads and worked with business users to handle rejected data.
  • Prepared and maintained mapping specification documentation.
  • Used the Debugger to test the mappings and fix the bugs.
  • Used Mapping Variables and Mapping Parameters to fulfill the business requirements.
  • Implemented Type 1 and Type 2 Slowly Changing Dimensions to maintain all historical information in dimension tables.
  • Involved in preparing technical and functional specifications.
  • Performance analysis and tuning of SQL statements across projects.
  • Imported data from different sources into Oracle tables using Oracle SQL*Loader (a sketch follows this list).
  • Wrote Oracle PL/SQL, stored procedures and triggers to populate data.
  • Involved in writing complex SQL statements for reports.
  • Worked closely with client to research and resolve user testing issues and bugs.
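
A minimal sketch of a SQL*Loader load like the one mentioned above: write a simple control file for a comma-delimited feed, then invoke sqlldr. The file, table and column names are illustrative assumptions; credentials are assumed to come from the environment.

#!/bin/sh
# load_customers.sh - illustrative SQL*Loader run; all names below are placeholders.

# Control file describing the comma-delimited source file and target staging table
cat > /tmp/customers.ctl <<'EOF'
LOAD DATA
INFILE '/data/inbound/customers.csv'
APPEND
INTO TABLE stg_customer
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
(customer_id, customer_name, address, status)
EOF

# Run the load; ORA_CONN is assumed to hold user/password@tns
sqlldr userid="${ORA_CONN}" control=/tmp/customers.ctl \
       log=/tmp/customers.log bad=/tmp/customers.bad errors=50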

Environment: Informatica PowerCenter, ETL, UNIX, PL/SQL, TOAD, Oracle 8i, SQL, SQL*Loader
