
Application Support Developer - ETL Informatica Resume


SUMMARY

  • Around 6 years of work experience in designing and developing data warehouse applications, with special emphasis on the design and development of data warehousing using Informatica Power Center 10.1/9.5/9.1/8.6. Expert in all phases of the Software Development Life Cycle (SDLC) - project analysis, requirements, design documentation, development, unit testing, user acceptance testing, implementation, post-implementation support, and maintenance.
  • Extensively worked on the ETL process, consisting of data sourcing, mapping, transformation, conversion, and loading.
  • Strong experience with Informatica client tools - Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer, Transformation Developer, Workflow Manager/Monitor - and server tools - Informatica Server, Repository Server Manager.
  • Strong expertise in relational database systems such as Oracle, SQL Server, and MS Access. Strong knowledge of writing simple and complex queries in databases including Oracle, Teradata, MySQL, DB2, and SQL Server.
  • Good experience in creating and using stored procedures, functions, triggers, views, and packages in different databases.
  • Extensive experience in data profiling, data migration from various legacy sources and relational systems to OLAP and decision support target systems.
  • Good knowledge of Informatica IDQ. Proficient in Relational Database Management Systems (RDBMS).
  • Knowledge of Master Data Management (MDM) and data integration concepts in large-scale implementation environments. Proficient in developing complex mappings using Informatica, data cleansing, and Slowly Changing Dimensions (SCD) Types 1 and 2.
  • Database experience using Oracle 11g/10g/9i, MS SQL Server 2000, and MS Access. Good knowledge of Informatica administration in Windows and Linux environments.
  • Practical knowledge of Data warehouse concepts, Data modeling principles - Star Schema, Snowflake, Normalization/De-normalization.
  • Expertise in configuration, performance tuning, and installation of Informatica; in integrating various data sources such as Oracle, DB2, and flat files into the staging area; and in designing ETL processes that span multiple projects.
  • Experience working on multi-dimensional warehouse projects. Experience in reading and loading high-volume Type 2 dimensions by implementing Slowly Changing Dimensions (SCD); see the sketch after this list.
  • Expertise in OLTP/OLAP system study and in understanding E-R modeling and dimensional models using Star schema and Snowflake schema techniques.
  • Highly proficient in processing tasks, scheduling sessions, importing/exporting repositories, and managing users, groups, associated privileges, and folders.
  • Proficient in creating UNIX shell scripts. Excellent communication skills and coordination with project managers, business analysts, architects, DBAs, and developers.
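
The SCD Type 2 handling mentioned above was built as Informatica mappings; the sketch below only illustrates the Type 2 pattern (expire the current row, insert a new version) using Python and SQLite3, with a hypothetical customer_dim table and column names.

```python
# Sketch of SCD Type 2 versioning with Python + SQLite3.
# customer_dim, cust_id, city, eff_date, end_date and current_flag are
# hypothetical names; the production logic lives in Informatica mappings.
import sqlite3
from datetime import date

HIGH_DATE = "9999-12-31"

def apply_scd2(conn, cust_id, new_city, load_date=None):
    """Expire the current version of a changed row and insert a new version."""
    load_date = load_date or date.today().isoformat()
    row = conn.execute(
        "SELECT city FROM customer_dim WHERE cust_id = ? AND current_flag = 'Y'",
        (cust_id,),
    ).fetchone()
    if row and row[0] == new_city:
        return  # no change, nothing to version
    if row:
        # Close out the existing current record (Type 2 expiry).
        conn.execute(
            "UPDATE customer_dim SET end_date = ?, current_flag = 'N' "
            "WHERE cust_id = ? AND current_flag = 'Y'",
            (load_date, cust_id),
        )
    # Insert the new current version with an open-ended end date.
    conn.execute(
        "INSERT INTO customer_dim (cust_id, city, eff_date, end_date, current_flag) "
        "VALUES (?, ?, ?, ?, 'Y')",
        (cust_id, new_city, load_date, HIGH_DATE),
    )
    conn.commit()

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.execute(
        "CREATE TABLE customer_dim "
        "(cust_id INT, city TEXT, eff_date TEXT, end_date TEXT, current_flag TEXT)"
    )
    apply_scd2(conn, 101, "Boston", "2019-01-01")
    apply_scd2(conn, 101, "Chicago", "2019-06-01")  # versions the Boston row
    print(conn.execute("SELECT * FROM customer_dim ORDER BY eff_date").fetchall())
```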

TECHNICAL SKILLS

ETL Tools: Informatica Power Center 10.1/9.6/9.5/9.1/8.6, Informatica Developer Data Quality, Informatica Intelligent Cloud Services (IICS)

Databases: Oracle 11g/10g/9i, DB2, MS SQL Server (2012, 2008, 2005), Netezza, SAP, Redshift, Athena, SAP Hana S/4, SAP ECC, Python SQLite3

Other Tools: Oracle Report Writer, TOAD, MLOAD, Erwin 4.x/3.5/3.x, SQL, XML, XSL, SQL*Loader and SQL Developer, PyCharm, Jupyter Notebook, Tableau

Languages: SQL, PL/SQL, UNIX Shell Scripting, Unix Commands, .NET, Visual Basic

Scheduling tools: Autosys, Control-M, Cisco Tidal, UC4 Scheduler

Programming Languages: Python 3.0, PL/SQL, SQL, T-SQL, Unix, .NET, ASP.NET

Operating Systems: Windows Server 2008/2003, UNIX, LINUX.

PROFESSIONAL EXPERIENCE

Confidential

Application Support Developer - ETL Informatica

Responsibilities:

  • Using Informatica PowerCenter 9.6 for ETL jobs to load data from Oracle 11g to Oracle 11g, since the source and target systems are the same.
  • Working with business users and functional owners to gather requirements; creating a requirement document, presenting it to the business with development timelines, obtaining approvals after timeline negotiations, and then starting development.
  • Using WinSCP and PuTTY for parameter files, scripts, and file transfers on the internal servers.
  • Extensively used Pre-SQL and Post-SQL scripts for loading the data into the targets. Developed mappings to load data using SCD Type 1, SCD Type 2 dimensions.
  • Writing SQL queries against Amazon Redshift and AWS Athena, with strong design and development experience on both.
  • Extensively writing SQL queries against the Oracle database to perform CRUD operations and handle the data.
  • Designing, developing, testing, implementing, and troubleshooting ETL mappings and workflows in a large data warehouse environment using Informatica PowerCenter 9.6x.
  • Applying AWS Glue pipeline knowledge to develop ETL for data movement into Redshift, including mapping source-to-target rules and fields.
  • Writing SQL queries in Python 3.0 with SQLite3, using PyCharm, to handle data through a Python framework; see the sketch after this list.
  • Writing ETL Design documentation and maintaining overall design documents and deliveries.
  • Performing ETL work on the CDW from the Salesforce operational system using APIs, through Oracle and Teradata.
  • Creating reusable transformations in Mapplet designer using Informatica PowerCenter Designer.
  • Scheduling the Informatica workflows in the UC4 scheduler to automate the Informatica jobs.
  • Handing over all the workflows to the production team after unit testing, UAT, and SIT, and supporting the production team with KT documents containing mapping and workflow information.
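
A minimal sketch of the Python 3 / SQLite3 querying mentioned above. The etl_audit table, its columns, and the sample rows are hypothetical stand-ins, not the project's actual schema.

```python
# Sketch of SQL run through Python 3's sqlite3 module. The etl_audit table and
# sample rows are hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.row_factory = sqlite3.Row
conn.executescript("""
CREATE TABLE etl_audit (workflow_name TEXT, load_status TEXT, load_ts TEXT);
INSERT INTO etl_audit VALUES ('wf_orders',    'SUCCESS', '2019-03-01 02:10');
INSERT INTO etl_audit VALUES ('wf_customers', 'FAILED',  '2019-03-01 02:15');
""")

# Parameterized query, the same pattern used for routine data checks.
failed = conn.execute(
    "SELECT workflow_name, load_ts FROM etl_audit WHERE load_status <> ?",
    ("SUCCESS",),
).fetchall()
for row in failed:
    print(row["workflow_name"], row["load_ts"])
```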

Environment: Informatica PowerCenter 9.6, 10.2x (upgraded), WinSCP, PuTTY, Oracle 11g, UC4 Scheduler, AWS Redshift, Teradata, Salesforce, Tableau.

Confidential

Informatica - SAP Integration Specialist - ETL Informatica

Responsibilities:

  • Using Informatica Power Center 10.1/10.2 versions for ETL jobs, loading the data from heterogeneous source systems to target database.
  • Working with Business Users and Functional Owners in getting the requirements and completing the development according to the user request and functional owner’s specifications.
  • Troubleshooting the existing mappings, workflows, debugging the technical issues in Informatica and resuming the failed jobs.
  • Using WinSCP and PuTTY for file transfers on the internal servers.
  • Designing and developing several complex mappings using various transformations like Source Qualifier, Aggregator, Router, Joiner, Union, Expression, Lookup (Connected & unconnected), Filter, Update Strategy, Stored Procedure, Sequence Generator and used reusable transformations as well as mapplets.
  • Using different databases - SQL Server, Netezza, Oracle, and SAP - to extract, transform, and load the data.
  • Working with Workflow Manager to create various tasks such as worklets, sessions, batches, e-mail notifications, and Decision tasks, and to schedule jobs.
  • Extensively using SQL to query EDWs in different databases such as Netezza, SQL Server, and Oracle.
  • Reading and writing data from SAP systems via Informatica Power Center using the IDoc file format.
  • Using the Cisco Tidal scheduler to schedule jobs according to users' requests.
  • Supporting functional and unit testing before releasing the mappings to the final testing team.
  • Creating and loading the dimensional tables and views in databases linked to the SAP systems, which makes it easier for Informatica to read and write SAP data via relational databases; see the sketch below.
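
A minimal sketch of the dimension-view pattern described in the last bullet: a relational view over a staging table gives Informatica a stable, flattened structure to read. Table and column names are hypothetical, and SQLite stands in for the actual SAP-linked database.

```python
# Sketch of a dimension view over an SAP-style staging table. SQLite stands in
# for the actual database; stg_material / dim_material are hypothetical names.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE stg_material (matnr TEXT, maktx TEXT, mtart TEXT, load_ts TEXT);
INSERT INTO stg_material VALUES ('M-001', 'Leather handbag', 'FERT', '2019-02-01');

-- The view exposes business-friendly column names to downstream mappings.
CREATE VIEW dim_material AS
SELECT matnr AS material_id,
       maktx AS material_desc,
       mtart AS material_type
FROM stg_material;
""")
print(conn.execute("SELECT * FROM dim_material").fetchall())
```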

Environment: Informatica PowerCenter 10.1/10.2, WinSCP, Unix, Oracle 10g, Netezza, SQL Server Management Studio, SAP Hana S/4, SAP ECC, flat files, XML/XSDs, Windows 10.

Confidential

Informatica - SAP Integration Specialist - ETL Informatica

Responsibilities:

  • Moving the data from each disparate source into a central SAP S4 staging environment to process the conversion data.
  • Converting flat files, XMLs, RDBMS sources, and other data types from each source environment into a central repository (SAP S4).
  • Extensively using SQL to query the Netezza databases, which serve as the staging area for all conversion and integration activities.
  • Performing complex SQL overrides in the Source Qualifier in Informatica to increase mapping performance; see the sketch after this list.
  • Gathering and analyzing data to solve complex problems, and evaluating scenarios to make predictions on future outcomes and support decision making.
  • Converting Coach, Inc. data (sources: flat files, RDBMS, SAP Hana) into Kate Spade format and loading it into SAP S4 to form the Confidential data warehouse, which includes Coach, Kate Spade, and Stuart Weitzman.
  • Converting Stuart Weitzman data (sources: My SQL Server, flat files) into Kate Spade and Coach format and loading it into SAP S4 to form the Confidential data warehouse, which includes Coach, Kate Spade, and Stuart Weitzman.
  • Using Informatica Power Center v10.1 to implement the ETL Processes and complete the Data Conversion requirements from business and using Netezza as a staging area for all conversion works.
  • Performing extraction, transformation, and loading of data from database tables and flat file sources into Oracle 11g RDBMS in accordance with requirements and specifications.
  • Experience in migrating the existing SAP ERP Retail systems to SAP S4 ERP systems (an advanced system for monitoring ERP processes and logistics to run the business more proficiently), using a staging area to complete the data cleansing, masking, mining, and conversion processes.
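
A minimal sketch of the kind of Source Qualifier SQL override mentioned above: the join and filter are pushed down to the database instead of being handled in separate transformations. The tables and columns are hypothetical, and SQLite stands in for Netezza.

```python
# Sketch of a Source Qualifier SQL override: the join and date filter run in the
# database, so Informatica reads one pre-joined result set. Tables/columns are
# hypothetical; SQLite stands in for Netezza.
import sqlite3

SQ_OVERRIDE = """
SELECT o.order_id, o.order_dt, c.customer_id, c.customer_name
FROM   stg_orders o
JOIN   stg_customers c ON c.customer_id = o.customer_id
WHERE  o.order_dt >= '2019-01-01'
"""

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE stg_orders (order_id INT, order_dt TEXT, customer_id INT);
CREATE TABLE stg_customers (customer_id INT, customer_name TEXT);
INSERT INTO stg_orders VALUES (1, '2019-02-15', 10);
INSERT INTO stg_customers VALUES (10, 'Acme Retail');
""")
print(conn.execute(SQ_OVERRIDE).fetchall())
```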

Environment: Informatica Power Center 10.1/10.2, WinSCP, Unix, Oracle 10g, Netezza, SQL Server Management Studio, SAP Hana S/4, SAP ECC, flat files, XML/XSDs, Windows 10.

Confidential

Data Warehouse Engineer - ETL

Responsibilities:

  • Interacted with functional/end users to gather requirements for the core reporting system, understand the features users expected from the ETL and reporting system, and successfully implement the business logic.
  • Understood the business point of view to implement coding using Informatica Power Center Designer. Used Informatica Power Center 10.1 for retrieving data from the mainframe system.
  • Involved in development of ETL loads and Business intelligence reports and analysis, providing mentoring, guidance and troubleshooting to analysis team members in solving complex reporting and analytical problems.
  • Debugged invalid mappings using breakpoints; tested stored procedures and functions, Informatica sessions, batches, and the target data.
  • Performed the requirement gathering, analysis, design, development, testing, implementation, support, and maintenance phases of both MDM and Data Integration projects.
  • Identified sources, targets, mappings and sessions and tuned them to improve performance. Created and scheduled Sessions, Jobs based on demand, run on time and run only once using Workflow Manager.
  • Developed SQL and PL/SQL scripts to validate and load data into interface tables using SQL*Loader (see the validation sketch after this list). Implemented strategies for incremental data extraction as well as data migration to load data into Teradata.
  • Developed ETL programs using Informatica to implement the business requirements. Involved in data modeling and design of the data warehouse in star schema methodology with conformed granular dimensions and fact tables.
  • Expertise in Master Data Management concepts and methodologies, and the ability to apply this knowledge in building MDM solutions.
  • Utilized Informatica Data Quality (IDQ) for data profiling, matching and removing duplicate data, and fixing bad data and null values.
  • Implemented different Mappings with the collection of all Sources, Targets, and Transformations using Designer.
  • Implemented conceptual, normalized/denormalized logical, and physical data models to design OLTP and data warehouse environments.
  • Designed and developed mapplets and re-usable transformations, and used them in different mappings.
  • Experience developing and supporting complex DW transformations. Creating Reusable Transformations and Mapplets in a mapping.
  • Translated storyboards and metrics into technical requirements from which to develop insightful solutions.
  • Used ETL processes to load data from multiple sources into the staging area (Oracle 10g) using Informatica Power Center 10.1. Created mapplets and used them in different mappings.
  • Provided Knowledge Transfer to the end users and created extensive documentation on the design, development, implementation, daily loads and process flow of the mappings.
  • Performance tuning using Informatica partitioning. Involved in database tuning. Wrote UNIX shell scripts for various purposes in the project.
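
A minimal sketch of the validate-before-load step mentioned above. The real checks were written in SQL and PL/SQL against interface tables; this Python version only illustrates the same kind of rules (required keys, numeric amounts) on a hypothetical flat-file layout.

```python
# Sketch of pre-load validation in Python: rows failing basic checks are routed
# to a reject list instead of the interface table. The file layout and column
# names (account_id, txn_date, amount) are hypothetical.
import csv

REQUIRED = ("account_id", "txn_date", "amount")

def split_valid_and_rejects(path):
    """Return (valid_rows, rejected_rows_with_reason) for a delimited extract."""
    valid, rejects = [], []
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            if any(not row.get(col) for col in REQUIRED):
                rejects.append((row, "missing required field"))
                continue
            try:
                float(row["amount"])
            except ValueError:
                rejects.append((row, "non-numeric amount"))
                continue
            valid.append(row)
    return valid, rejects
```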

Environment: Informatica Power Center 10.1, Oracle 10g, MS SQL Server 2000, Teradata, SQL, PL/SQL, SQL*Loader, UNIX Shell Script, Tableau 2018, Workflow Manager, Power Center Designer, Oracle 11g, Autosys.
