
IICS Developer Resume


San Antonio, TX

SUMMARY

  • 5 years of experience in Information Technology, Data Warehousing, and ETL processes using Informatica PowerCenter, Informatica Cloud (IICS), and PowerExchange, as well as Teradata.
  • Experience with the Software Development Life Cycle (SDLC) under both Waterfall and Agile methodologies.
  • Experience in the Analysis, Design, Development, Testing, and Implementation phases of Business Intelligence solutions using the ETL tool Informatica PowerCenter (Repository Manager, Designer, Workflow Manager, and Workflow Monitor).
  • Experience in data warehouse techniques and practices, including ETL processes, dimensional data modeling (Star Schema, Snowflake Schema, fact and dimension tables), OLTP, and OLAP.
  • Strong hands-on experience in extracting data from various source systems such as Oracle, Oracle GoldenGate, Teradata, DB2, MySQL, SQL Server, SFDC, flat files, and XML.
  • Proficient in the development of Extract, Transform, and Load processes, with a good understanding of source-to-target data mapping and the ability to define and capture metadata and business rules.
  • Experience in Data Analysis, Data Mapping, Data Modeling, Data Profiling, and development of databases for business applications and data warehouse environments.
  • Experience in creating transformations (Joiner, Sorter, Aggregator, Expression, Lookup, Router, Filter, Update Strategy, Sequence Generator, Normalizer, and Rank) and mappings using Informatica Designer, and processing tasks using Workflow Manager to move data from multiple sources into targets.
  • Experience in writing stored procedures, packages, functions, triggers, views, and materialized views using SQL and PL/SQL.
  • Strong experience in Enterprise Data Warehouse environments with the Informatica and Teradata combination.
  • Hands-on experience with Teradata utilities such as BTEQ, FastExport, FastLoad, MultiLoad, TPump, and TPT to export and load data to and from different source systems, including flat files.
  • Wrote Teradata control scripts using MultiLoad, FastLoad, and FastExport to load and unload data.
  • Used Teradata utilities in combination with Informatica as loader connections.
  • Wrote BTEQ scripts to move data from Teradata staging to base tables.
  • Used Explain plans for Teradata performance tuning, applying techniques such as Primary Index (PI), Partitioned Primary Index (PPI), and COLLECT STATISTICS.
  • Used different Teradata table types (derived, volatile, and global temporary) in BTEQ scripts.
  • Performed data integration with SFDC using Informatica Cloud.
  • Generated flat files using Informatica that serve as index files exposed to Elasticsearch.
  • Developed Informatica workflows to insert new data, update existing data, and delete data from index files in the ELK stack.
  • Proficiency in data warehousing techniques for data cleansing, Slowly Changing Dimensions (SCD), surrogate key assignment, and Change Data Capture (CDC).
  • Experience in supporting and performing unit testing, system testing, integration testing, UAT, and production support for issues raised by application users.
  • Experience in performance tuning, identifying and resolving bottlenecks at various levels such as sources, targets, mappings, and sessions.
  • Excellent communication skills; a strong team player and self-starter with the ability to work independently or as part of a team.
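
The BTEQ staging-to-base loads described above follow a pattern like this minimal sketch; the database names, table names, columns, and logon string here are hypothetical placeholders, not objects from an actual project:

```shell
#!/bin/sh
# Minimal sketch of a BTEQ control script that moves data from a Teradata
# staging table to a base table. All object names and the logon string are
# hypothetical placeholders.
cat > load_base.bteq <<'EOF'
.LOGON tdprod/etl_user,etl_password;
.SET ERRORLEVEL UNKNOWN SEVERITY 8;

/* Insert only rows not already present in the base table */
INSERT INTO edw_base.customer (customer_id, customer_name, load_dt)
SELECT s.customer_id, s.customer_name, CURRENT_DATE
FROM   edw_stg.customer s
WHERE  NOT EXISTS (SELECT 1
                   FROM   edw_base.customer b
                   WHERE  b.customer_id = s.customer_id);

.IF ERRORCODE <> 0 THEN .QUIT 8;
.LOGOFF;
.QUIT 0;
EOF

# In a real environment this would be executed as: bteq < load_base.bteq
cat load_base.bteq
```

The NOT EXISTS guard is one common way to make a stage-to-base load rerunnable without duplicating rows.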

TECHNICAL SKILLS

ETL Tools: Informatica PowerCenter 10.2/9.6.1, Informatica Intelligent Cloud Services (IICS)

Reporting Tools: OBIEE, MicroStrategy

Scheduling Tools: Control-M, AutoSys, Maestro, Cron Tab, Informatica Scheduler

Databases: Teradata 15/14/13/12, MySQL, Oracle, DB2, SQL Server

Modeling: Dimensional Modeling, ER Modeling

Programming Languages: SQL, PL/SQL, Unix Shell Scripting

Web Technologies: HTML, XML

Methodologies: Agile, Waterfall

Operating Systems: Unix, Windows

PROFESSIONAL EXPERIENCE

Confidential, San Antonio, TX

IICS Developer

Responsibilities:

  • Analyzed the business requirements and functional specifications.
  • Developed ETL programs using Informatica Intelligent Cloud Services to implement the business requirements.
  • Responsible for analyzing UltiPro interfaces and creating design specifications and technical specifications based on functional requirements and analysis.
  • Designed, developed and maintained Snowflake database objects (tables, views, stored procedures, etc.) and SQL scripts.
  • Created mappings using Informatica Cloud transformations (Source, Target, Data Masking, Expression, Filter, Hierarchy Builder, Hierarchy Parser, Joiner, Lookup, Router, Sequence Generator, and Sorter) to load data from Salesforce to databases.
  • Developed an outbound messaging queue for Salesforce to capture real-time updates.
  • Created various API connections based on SOAP, REST, and REST V2.
  • Created real-time APIs using REST, SOAP, and web services exposed by applications.
  • Built processes using Application Integration service connectors, connections, process objects, and sub-processes.
  • Used a REST connection to access the VisionLink API, which provides machine location and work hours, and developed a job that lists all machines due for preventive maintenance service in the next 30 days.
  • Migrated PowerCenter 10 mappings into Informatica Cloud.
  • Developed API service connectors and configured app connections over the API services in the Application Integration module of IICS, creating processes that sync Salesforce to the DBS system in real time.
  • Created API service connectors and configured various actions within them to perform different operations on the API based on the endpoint URL.
  • Created a process invoked by the Salesforce outbound messaging mechanism to send real-time updates to the DBS system.
  • Developed audit, parameterization, and error-logging frameworks in Informatica Cloud for better tracking of the data.
  • Developed Informatica Cloud mappings to load data from the SFDC cloud downstream to the data warehouse on Snowflake.
  • Extensively used Toad utility for executing SQL scripts and worked on SQL for enhancing the performance of the conversion mapping.
  • Created tasks, sequential and concurrent batches for proper execution of mappings using task flow.
  • Documented program development, logic, coding, testing, changes, and corrections.
  • Created Test cases for the mappings developed and then created integration Testing Document.
  • Followed Informatica recommendations, methodologies and best practices.
  • Worked with the Tier 1 team to resolve ETL production tickets.
  • Resolved data issues in the DWH raised by end users via trouble tickets, prioritized by severity.
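
The audit and error-logging framework above was built inside Informatica Cloud; purely as an illustration of the pattern, here is the same idea expressed as a Unix shell wrapper. The job name, step names, and log path are hypothetical:

```shell
#!/bin/sh
# Sketch of an audit/error-logging wrapper: each step is logged with a
# timestamp and outcome so failures can be traced later. Job name, steps,
# and log file are hypothetical placeholders.
JOB="sfdc_to_snowflake_load"
LOG="audit.log"

run_step() {
    step="$1"; shift
    echo "$(date '+%Y-%m-%d %H:%M:%S') START $JOB.$step" >> "$LOG"
    if "$@"; then
        echo "$(date '+%Y-%m-%d %H:%M:%S') OK    $JOB.$step" >> "$LOG"
    else
        rc=$?
        echo "$(date '+%Y-%m-%d %H:%M:%S') FAIL  $JOB.$step rc=$rc" >> "$LOG"
        return $rc
    fi
}

run_step extract true           # placeholder for the real extract command
run_step load    false || echo "load step failed; see $LOG"
```

Wrapping every step through one logging function gives a single audit trail to consult when a production ticket comes in.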

Environment: Informatica Intelligent Cloud Services, Data Analysis (SQL, IDQ), Oracle 12c/11g, MS-SQL Server, SQL Assistant, Toad, HP Quality Center, Jira, Unix, Windows, Salesforce, SOQL, Postman, Azure, Snowflake, T-SQL.

Confidential, Orlando, FL

Sr. ETL Informatica Developer

Responsibilities:

  • Worked closely with the business teams to understand the requirements along with Business Analyst & ETL Architect Team.
  • Involved in the discussions with data modelers to review the design of the data model and to create database tables.
  • Involved in creating the technical design documents along with Business Analyst and design walk through with data architects.
  • Used Informatica Power Center 10.2 /9.6.1 for extraction, transformation and load (ETL) of data in the data warehouse.
  • Developed IICS Informatica Cloud mappings to extract the data from SFDC.
  • Created IICS mapping tasks, data replication tasks, and data synchronization tasks.
  • Created workflow tasks in IICS Informatica Cloud.
  • Extensively used transformations such as Router, Aggregator, Normalizer, Joiner, Expression, Lookup, Update Strategy, Sequence Generator, and Stored Procedure.
  • Developed complex mappings in Informatica to load the data from various sources.
  • Implemented performance tuning logic on targets, sources, mappings, sessions to provide maximum efficiency and performance.
  • Parameterized the mappings and increased the re-usability.
  • Used Informatica Power Center Workflow manager to create sessions, workflows and batches to run with the logic embedded in the mappings.
  • Extensively used Toad utility for executing SQL scripts and worked on SQL for enhancing the performance of the conversion mapping.
  • Used Salesforce API connection to load and retrieve data to Salesforce.
  • Developed Salesforce triggers used for data cleansing in the cloud; these triggers run on a daily basis on the data fed to them as part of data integration.
  • Created sessions, sequential and concurrent batches for proper execution of mappings using workflow manager.
  • Wrote Unix scripts to create flat-file outputs, and to collect flat files from the server and load them into a database.
  • Extensively used Informatica debugger to figure out the problems in mapping.
  • Worked with the Salesforce team to frame validation rules on Salesforce objects to keep data clean in the cloud.
  • Involved in troubleshooting existing ETL bugs.
  • Created a list of the inconsistencies in the data load on the client side to review and correct the issues on their side.
  • Worked with the business to formulate requirement documents and documented the source-to-target data flow.
  • Created the ETL exception reports and validation reports after the data is loaded into the warehouse database.
  • Documented program development, logic, coding, testing, changes, and corrections.
  • Created Test cases for the mappings developed and then created integration Testing Document.
  • Followed Informatica recommendations, methodologies and best practices.
  • Worked with the Tier 1 team to resolve ETL production tickets.
  • Resolved data issues in the DWH raised by end users via trouble tickets, prioritized by severity.
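
A Unix script that produces a flat-file output for a downstream database load, as mentioned above, can be sketched roughly as follows; the file name, delimiter, and column layout are hypothetical, and in a real job the rows would come from a query spool rather than literals:

```shell
#!/bin/sh
# Sketch of a Unix script that writes a pipe-delimited flat file for a
# downstream database load. File name and column layout are hypothetical;
# real data would be spooled from a source query.
OUT="customers_$(date +%Y%m%d).dat"

{
    echo "customer_id|customer_name|state"
    echo "1001|Acme Corp|TX"
    echo "1002|Globex Inc|FL"
} > "$OUT"

wc -l "$OUT"     # quick row-count check before handing off to the loader
```

A trailing row-count check like this is a cheap sanity test before the file is picked up by a loader utility.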

Environment: Informatica Power Center 10.1/9.6.1, IICS, Oracle 12c/11g, MS-SQL Server, Mainframes, Teradata 15/14, SQL Assistant, Putty, WinSCP, Toad, HP Quality Center, Jira, Unix, Windows, SOQL, Salesforce Marketing Cloud, T-SQL.

Confidential, Dallas, TX

Sr. ETL Informatica/Teradata Developer

Responsibilities:

  • Analyzed the business requirements and functional specifications.
  • Developed ETL programs using Informatica to implement the business requirements.
  • Communicated with business customers to discuss the issues and requirements.
  • Extracted data from different sources, staged it in a single place, and applied business logic to load it into the DWH.
  • Used Informatica Power Center 10.2/9.6.1 for extraction, transformation and load (ETL) of data in the data warehouse.
  • Extensively used transformations such as Router, Aggregator, Normalizer, Joiner, Expression, Lookup, Update Strategy, Sequence Generator, and Stored Procedure.
  • Developed complex mappings in Informatica to load the data from various sources.
  • Implemented performance tuning logic on targets, sources, mappings, sessions to provide maximum efficiency and performance.
  • Parameterized the mappings and increased the re-usability.
  • Used Informatica Power Center Workflow manager to create sessions, workflows and batches to run with the logic embedded in the mappings.
  • Extensively used Toad utility for executing SQL scripts and worked on SQL for enhancing the performance of the conversion mapping.
  • Used Teradata loaders to load data into the staging area.
  • Used Teradata BTEQ scripts to load data from stage to base.
  • Extensively worked with Teradata utilities: BTEQ, MultiLoad, FastLoad, TPump, and TPT.
  • Extensively used Informatica debugger to figure out the problems in mapping.
  • Involved in troubleshooting existing ETL bugs.
  • Created a list of the inconsistencies in the data load on the client side to review and correct the issues on their side.
  • Created the ETL exception reports and validation reports after the data is loaded into the warehouse database.
  • Documented program development, logic, coding, testing, changes, and corrections.
  • Created Test cases for the mappings developed and then created integration Testing Document.
  • Followed Informatica recommendations, methodologies and best practices.
  • Worked with the Tier 1 team on various ETL failures in production.
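
The MultiLoad flat-file-to-staging loads mentioned above follow a control-script pattern like this sketch; the log table, staging table, fields, input file, and logon string are all hypothetical placeholders:

```shell
#!/bin/sh
# Sketch of a MultiLoad control script loading a pipe-delimited flat file
# into a Teradata staging table. All names and the logon string are
# hypothetical placeholders.
cat > stage_load.mload <<'EOF'
.LOGTABLE edw_wrk.customer_lt;
.LOGON tdprod/etl_user,etl_password;

.BEGIN IMPORT MLOAD TABLES edw_stg.customer;
.LAYOUT cust_layout;
.FIELD customer_id   * VARCHAR(10);
.FIELD customer_name * VARCHAR(100);

.DML LABEL ins_cust;
INSERT INTO edw_stg.customer (customer_id, customer_name)
VALUES (:customer_id, :customer_name);

.IMPORT INFILE customers.dat
        FORMAT VARTEXT '|'
        LAYOUT cust_layout
        APPLY ins_cust;
.END MLOAD;
.LOGOFF;
EOF

# Real execution would be: mload < stage_load.mload
cat stage_load.mload
```

The layout/DML/import split is what lets one control script map a delimited file onto an INSERT statement field by field.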

Environment: Informatica Power Center 10.1/9.6.1, Oracle 12c/11g, MS-SQL Server, Mainframe, Teradata 15/14, SQL Assistant, BTEQ, MultiLoad, FastLoad, FastExport, TPT, Putty, WinSCP, Toad, HP Quality Center, Jira, Unix, Windows, T-SQL.

Confidential

ETL Informatica/Teradata Developer

Responsibilities:

  • Analyzing and understanding the end user requirements and business rules.
  • Identify all potential issues during the requirement understanding phase and to describe actions to address those issues.
  • Performed impact analysis in terms of schedule changes, dependency impacts, and code changes for various change requests on existing Data Warehouse applications running in a production environment.
  • Identified database objects such as PL/SQL procedures, tables, and views that would be impacted as part of a requirement.
  • Identified scripts such as BTEQ, MLoad, FLoad, and TPT that would be impacted as part of a particular project requirement.
  • Coordinating and delegating the work to other team members.
  • Responsible for Technical implementation of the change request solution.
  • Supporting enterprise data warehouse and ETL development activities.
  • Fine-tuned existing scripts and processes to achieve increased performance and reduced load times for faster user query performance.

Environment: Informatica Power Center 8.6.1/9.1.1, Oracle 10g, Teradata 12/13, SQL, Flat Files, MySQL, Toad, SQL Assistant, Unix, Windows, T-SQL.
