
Informatica Developer Resume


Southlake, TX

SUMMARY:

  • 7 years of strong experience performing ETL processes such as Data Extraction, Data Transformation, and Data Loading with Informatica PowerCenter 9.x, 8.x, 7.x, and 6.x.
  • Experience in all the phases of the Data Warehouse project life cycle including Requirement Gathering, Design, Development, Testing, UAT, and Implementation of Data warehouses using ETL, Online Analytical Processing & reporting tools.
  • Extensive knowledge of Informatica transformations such as Source Qualifier, Aggregator, Lookup, Rank, Joiner, Filter, Router, Sorter, Sequence Generator, Union, Update Strategy, Stored Procedure, Normalizer, and XML transformations.
  • Experience in Repository Configuration and in creating Informatica Mappings, Mapplets, Sessions, Worklets, Workflows, and processing tasks using Informatica Designer / Workflow Manager to move data from multiple source systems (Oracle, SQL Server, flat files) to targets.
  • Strong knowledge of RDBMS concepts, Dimensional Data Modeling (Star Schema, Snowflake Schema, FACT and Dimension tables), and Logical and Physical Data Modeling.
  • SQL/database developer experience in writing efficient SQL queries and PL/SQL scripts, fine-tuning queries, and writing SQL for ad hoc reporting.
  • Experience in maintenance and enhancement of existing systems, including debugging failed mappings and developing error-handling methods as part of production support.
  • Experience in UNIX shell scripting to fetch input files and archive them for PowerCenter usage.
  • Ability to quickly grasp available technology, tools and existing client Architecture.
  • Excellent written and verbal communication skills and analytical skills, with the ability to perform independently as well as in a team.

TECHNICAL SKILLS:

ETL Tools: Informatica PowerCenter 9.x/8.x/7.x/6.x (Workflow Manager, Workflow Monitor, Warehouse Designer, Transformation Developer, Mapplet Designer, Mapping Designer, Repository Manager, and Informatica Server), PowerExchange 8.x/7.x

Database & Related Tools: Oracle 10g/9i/8i, MS SQL Server 2005/2008, DB2, SQL*Plus, SQL Loader, MS Access, Toad for Oracle, RapidSQL, FileZilla

Data Modeling Tools: Erwin 4.0 (Dimensional Data Modeling, Star schema, Snowflake schema), Physical and Logical Data Modeling, ERStudio

BI & Reporting Tools: OBIEE 10.1.3.4, Business Objects 6.5

Languages: SQL, PL/SQL, C, C++, JAVA, Unix Shell Scripting, JIL

Operating Systems: UNIX, Microsoft Windows 98/NT/2000/2003/XP/Vista/7/10

Other: MS Office 2003/2007/2010/2013 (Word, Excel, Outlook, PowerPoint), IBM Lotus Notes, Notepad++, Notepad, MS Visio, OneNote, Adobe Acrobat

PROFESSIONAL EXPERIENCE:

Confidential, Southlake, TX

Informatica Developer

Responsibilities:

  • Refined the call data at the segment level for more accurate call counting.
  • Created a parent-call perspective that groups all segments of a call.
  • Linked Customer Interaction History data to the master call data.
  • Studied and assessed the existing STM (Source-to-Target Mapping) documents for the Staging (STG), Data Warehouse (DW), and Data Mart (DM) schemas, the logical data models for each schema, and the data definition training documents to get familiar with the data.
  • Became familiar with the entire ETL process, including workflows, sessions, associated mappings, stored procedures, and job schedulers.
  • Participated in Data Model design sessions with Data modeler, BSA and provided valuable input impacting the ETL build work.
  • Created ETL framework and design document after the PDM (Physical Data Model) was ready.
  • Assisted the BSA in finalizing the data requirements from the source system, then consumed the data files (pipe-delimited, quoted) and loaded them to the staging schema without any transformation.
  • Provided technical input to BSA while creating the STM document.
  • Worked with Project Manager to measure the total effort and development approach.
  • Assisted BSA in implementing the FI global file transfer mechanism standards (SFTP).
  • Designed and developed mappings, defined workflows and tasks, monitored sessions, exported and imported mappings and workflows to populate the Data Warehouse (DW) and Data Mart (DM) tables as per the business rules defined in STM.
  • Created Mapping Parameters & Variables and used transformations including, but not limited to, Lookup, Joiner, Expression, Source Qualifier, Router, Filter, Aggregator, and Sequence Generator as part of the ETL logic.
  • Created Autosys JIL scripts as file watchers and to invoke the Perl scripts after arrival of the source files.
  • Used UNIX scripts to decrypt, copy, archive, delete, and validate the files and to call the Informatica workflows.
  • Responsible for packaging the code for every sprint and delivering it to the delivery manager; packages included the Informatica code checked into PERFORCE, new or updated JIL scripts, DDLs, and the release notes.
  • Executed complex SQL queries for unit testing, data analysis and data profiling.
  • Experienced in writing Oracle procedures, functions, packages, and database triggers, and in troubleshooting PL/SQL and SQL.
  • Supported ETL jobs in production; troubleshot failures and provided solutions. Served as the primary Production Support resource four times a year and as secondary four times a year. Identified recurring issues/alerts and conducted root cause analysis.
  • Ensured all deliverables conform to ETL best practices and standards.
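The wrapper-script pattern described above (validate an arriving file, archive it, then kick off the PowerCenter workflow) can be sketched roughly as follows. This is a minimal illustration, not the actual project code: the paths, file names, and function names are invented, and the real decrypt (e.g. GPG) and Autosys file-watcher pieces sit outside this script.

```shell
#!/bin/sh
# Hypothetical wrapper: validate an arriving source file, archive a
# timestamped copy, then hand off to the PowerCenter workflow.
# All names and paths here are illustrative assumptions.

ARCHIVE=${ARCHIVE:-/tmp/etl_archive}

validate_file() {
    # Reject missing or empty files before any load is attempted.
    [ -s "$1" ] || { echo "ERROR: $1 missing or empty" >&2; return 1; }
}

archive_file() {
    # Keep a timestamped copy so reruns never overwrite history.
    mkdir -p "$ARCHIVE"
    cp "$1" "$ARCHIVE/$(basename "$1").$(date +%Y%m%d%H%M%S)"
}

process_file() {
    validate_file "$1" || return 1
    archive_file "$1"
    # The real job would invoke the PowerCenter CLI at this point, e.g.:
    #   pmcmd startworkflow -sv <service> -d <domain> -f <folder> <workflow>
    echo "ready to start workflow for $(basename "$1")"
}
```

In production the Autosys file-watcher job would be the trigger for a script like this, and decryption would run before validation.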

Environments: Informatica PowerCenter 9.0.1 (Repository Manager, Mapping Designer, Workflow Manager, and Workflow Monitor), MS SQL Server 2008, Oracle, Toad for Oracle 12.1, SFTP, UNIX, Visio, Microsoft Office 2013, Rapid SQL 8.6, PERFORCE, P4V, SecureCRT 7.1, Notepad++

Confidential, Overland Park, KS

Informatica Developer

Responsibilities:

  • Involved in designing, developing and documenting of the ETL process (Extract, Transformation and Load) to populate the Data from various source system feeds using Informatica.
  • Involved in sourcing the XML rows from the source database.
  • Parsed the XML, filtering out the key/value combinations that were not needed, and created the target file, a pipe-delimited .txt file with each row containing policy id, key, and value.
  • Created data files ready to be loaded into the target database tables.
  • Loaded the data files into the final target database.
  • Partitioned sessions for concurrent loading of data into the target tables.
  • Designed and developed mappings, defined workflows and tasks, monitored sessions, exported and imported mappings and workflows.
  • Created and configured workflows and sessions to transport the data to target warehouse Oracle tables using Informatica Workflow Manager.
  • Created Mapping Parameters and Variables.
  • As part of the INBOUND interface project, ran Perl and UNIX scripts to decrypt files from Insurity; as part of the OUTBOUND interface project, ran Perl and UNIX scripts to encrypt and send files to the Insurity server.
  • Connected to the Insurity FTP site from the UNIX server to download the files and used SmartFTP for transferring the files.
  • Used SQL tools like SQL Server Management Studio to run SQL queries and validate the data pulled into reports.
  • Complete understanding of data warehouse architecture, star schemas, and snowflake schemas.
  • Experienced in data profiling, data analysis, data quality checking.
  • Experienced with documenting data models.
  • Supported day-to-day issues as well as new pieces of work.
  • Identified recurring issues/alerts and conducted root cause analysis.
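The XML-to-pipe-delimited step described above was done inside PowerCenter, but the transformation itself can be sketched at the shell level. This is a simplified, hypothetical analogue: the element and attribute names (`policy_id`, `key`, `value`), the excluded key, and the one-element-per-line layout are all assumptions made for the example.

```shell
#!/bin/sh
# Simplified sketch of the parsing step: emit one "policy_id|key|value"
# line per <attr> element, skipping keys on an exclusion list.
# Assumes one XML element per input line; names are invented.

xml_to_pipe() {
    awk '
        # Remember the policy id from the enclosing <row> element.
        match($0, /policy_id="[^"]*"/) {
            pid = substr($0, RSTART + 11, RLENGTH - 12)
        }
        # For each <attr>, pull the key and value and filter unwanted keys.
        match($0, /key="[^"]*"/) {
            k = substr($0, RSTART + 5, RLENGTH - 6)
            if (match($0, /value="[^"]*"/)) {
                v = substr($0, RSTART + 7, RLENGTH - 8)
                if (k != "internal_flag") print pid "|" k "|" v
            }
        }
    ' "$1"
}
```

The output is the pipe-delimited text file format described above, ready for a downstream load session.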

Environments: Informatica PowerCenter 9.1.0 (Repository Manager, Mapping Designer, Workflow Manager, and Workflow Monitor), MS SQL Server 2008, PuTTY, SmartFTP, FileZilla 3.5.3, and UNIX

Confidential, Dallas, TX

Informatica Developer

Responsibilities:

  • Created multiple tables in the Data Warehouse that collect all the attributes of the calls received through the Call Center System.
  • Partitioned the call interactions at three different levels, namely Call Interaction, Call Mediation, and Master Interaction.
  • Involved in the business requirement sessions to learn which call data elements the business is interested in and how they are used for its reporting needs.
  • Reviewed the business requirements with BSA around what data needs to be brought into the data warehouse and data mart.
  • Contributed to sessions with the DBA, Architects, and BSA for designing, developing, and documenting the ETL process (Extract, Transformation and Load) to populate the data from Call Center System source system feeds using Informatica.
  • Assisted BSA in finalizing the data pull process from the source system (SQL Server 2005) and the scheduling portion of various database instances of the source system based on the call center site.
  • Used Autosys to schedule the stored procedures used to pull the data from various source databases and load it into the staging environment without applying much transformation logic.
  • Designed and developed mappings, defined workflows and tasks, monitored sessions, exported and imported mappings and workflows to extract data from staging environment, apply business rules with Informatica transformation and load the data into Data warehouse and data mart environments.
  • Created Mapping Parameters and Variables and used Source Qualifier, Lookup, Router, Filter, Joiner, Aggregator, Sequence Generator, and Expression transformations as part of the ETL logic.
  • Responsible for packaging the code for deployment, package included the DDLs, Unix scripts, JIL scripts and the Informatica code.
  • Compiled and executed complex SQL queries wherever needed in the ETL process, in unit testing, and in data validation after the ETL process completed successfully.
  • Assisted BSA and QA team when needed.
  • Handled data analysis stories, system testing stories to assist BSA and QA team.
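The post-load data-validation step above ran as SQL against the staging and warehouse schemas; a file-level analogue of the same idea, reconciling row counts between a source extract and a target extract, can be sketched as below. The function name and file layout are assumptions for illustration only.

```shell
#!/bin/sh
# Hypothetical post-load check: compare row counts between two
# pipe-delimited extracts and flag any mismatch. Names are invented.

reconcile_counts() {
    # tr -d ' ' strips the padding some wc implementations emit.
    src_rows=$(wc -l < "$1" | tr -d ' ')
    tgt_rows=$(wc -l < "$2" | tr -d ' ')
    if [ "$src_rows" -eq "$tgt_rows" ]; then
        echo "PASS: $src_rows rows in both extracts"
    else
        echo "FAIL: source=$src_rows target=$tgt_rows"
        return 1
    fi
}
```

A nonzero exit status lets a scheduler such as Autosys alert on the failed validation.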

Environments: MS SQL Server 2005, Toad for Oracle, Informatica PowerCenter 9.0.1 (Repository Manager, Mapping Designer, Workflow Manager, and Workflow Monitor), Oracle, SFTP, UNIX, Visio, Microsoft Office 2007, Perforce, SecureCRT, Notepad.
