Informatica Developer Resume

Cleveland, OH

SUMMARY:

  • 7+ years of technical and functional experience in data warehouse implementations using Informatica PowerCenter 9.x/8.x, Oracle 11g/10g, and MS SQL Server 2012/2008.
  • Extensive experience with Informatica PowerCenter in all phases of analysis, design, development, implementation, and support of data warehousing applications.
  • Expert database skills using SQL and PL/SQL Developer for debugging applications.
  • Strong knowledge of data warehousing concepts and dimensional modeling: Star Schema and Snowflake Schema, fact and dimension tables.
  • Extensively worked on Transformations such as Lookup, Filter, Expression, Router, Normalizer, Joiner, Update Strategy, Rank, Aggregator, Stored Procedure, Sorter, and Sequence Generator.
  • Experience working with business analysts to identify, study, and understand requirements, and translating them into ETL code during the requirements analysis phase.
  • Worked with various Salesforce.com (CRM) standard objects such as Accounts, Contacts, Leads, Cases, Campaigns, Reports, and Dashboards.
  • Strong Knowledge of SFDC standard Data structures and familiarity with designing Custom Objects and Force.com platform and Force.com Sites.
  • Experienced in using the IDQ tool for profiling, applying rules, and developing mappings to move data from source to target systems.
  • Worked on optimizing and tuning Netezza SQL to improve batch performance.
  • Extensively used UNIX scripting and scheduled pmcmd and pmrep commands to interact with the Informatica server from the command line (see the sketch after this list).
  • Experience installing, configuring, and maintaining the MDM Hub, MDM application servers (WebLogic, JBoss), and MDM data stores.
  • Proficient in using Informatica Workflow Manager and Workflow Monitor to create, schedule, and control workflows, tasks, and sessions.
  • Used Netezza GROOM to reclaim space in tables and databases.
  • Implemented data warehousing best practices and delivered quality code.
  • Extensive experience in the Data Analysis, Design, Development, Implementation and Testing of Data Warehouse.
  • Experienced with Informatica PowerExchange (5.x/8.x) for loading and retrieving data from mainframe systems.
  • Experience documenting High-Level and Low-Level Designs, Source-to-Target Mappings (STM), macro and micro documents, unit test plans, unit test cases, and code migration reports.
  • Experience with industry standard methodologies like waterfall, Agile within the software development life cycle.
  • Extensively used SQL and PL/SQL to write Stored Procedures, Functions, Packages and Triggers.
  • Experience writing various ETL procedures to load data from sources such as Oracle 10g, DB2, XML files, flat files, MS SQL Server, and MS Access into data marts and data warehouses using Informatica PowerCenter.
  • Experience in optimizing query performance, session performance and fine tuning the mappings for optimum performance.
  • Expert in Data Modeling using Star Schema/Snowflake Schema, Fact and Dimensions tables and Slowly Changing Dimensions (SCD1, SCD2)
  • Proficient in the development of solutions using Informatica (Siperian) MDM Hub.
  • Implemented different slowly changing dimensions (SCD) using Informatica.
  • Experience in creating Reusable Tasks (Session, Command, Email) and Non-Reusable Tasks (Decision, Event Wait, Event Raise, Timer, Assignment, Worklet, Control).
  • Worked on performance tuning, identifying and resolving bottlenecks at the source, mapping, and session levels.
  • Worked with Oracle stored procedures, table partitions, triggers, SQL queries, and PL/SQL packages, and loaded data into data warehouses and data marts using Informatica.
  • Extensive experience writing scripts and automating ETL processes using UNIX shell scripting.
  • Extensive experience with scheduling tools such as SAS Management Console, crontab, and lsfadmin.
  • Expert in debugging, troubleshooting, monitoring, and performance tuning of sources, mappings, targets, and sessions.
  • Quick learner and excellent team player with good interpersonal, presentation, and development skills and a strong analytical, problem-solving approach.
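
A minimal sketch of the pmcmd/pmrep command-line usage referenced above; the service, domain, repository, folder, and workflow names are placeholders, not actual project values:

      # Start a workflow and wait for it to finish (names are illustrative)
      pmcmd startworkflow -sv INT_SVC -d DOM_DEV -u "$INFA_USER" -p "$INFA_PWD" -f FOLDER_EDW -wait wf_load_dim_customer

      # Connect to the repository and list its workflows
      pmrep connect -r REP_DEV -d DOM_DEV -n "$INFA_USER" -x "$INFA_PWD"
      pmrep listobjects -o workflow -f FOLDER_EDW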

TECHNICAL SKILLS:

ETL Tools: Informatica PowerCenter 9.x/8.x (PowerCenter Repository, Informatica Designer, Workflow Manager, Workflow Monitor), PowerExchange, Developer (IDQ) and Analyst (IDE), Informatica Cloud.

BI & Reporting: Business Objects Xi R3, Crystal Reports, SSRS.

Data Modeling: Relational Modeling, Dimensional Modeling (Star Schema, Snowflake Schema, Facts, Dimensions), Data Quality, Entities, Attributes, ER Diagrams.

Databases: Oracle 11g/10g, MS SQL Server 2012/2008, Teradata V2R6/V2R5, MS Access, DB2.

Job Scheduling & Other Tools: SQL*Plus, PL/SQL Developer, Toad, SQL*Loader, Teradata SQL Assistant 7.1, lsfadmin, Crontab, Control-M.

Environments: UNIX (Sun Solaris, AIX), Windows 2000/NT/XP/Vista/7/8, Windows Server 2008.

Languages: PL/SQL, SQL, XML, HTML, Data Structures, UNIX Shell Script.

Office Applications: MS Office 2003/2007/2013.

PROFESSIONAL EXPERIENCE:

Informatica Developer

Confidential, CLEVELAND, OH

Responsibilities:

  • Understood the business rules provided in order to design and develop the ETL mappings.
  • Gained a complete understanding of the business requirements; analyzed and profiled the source data to be loaded into the Oracle warehouse.
  • Worked with different sources, including flat files (both delimited and fixed-width), Oracle, and XML.
  • Prepared technical specifications for the development of Informatica (ETL) mappings to load data into various target tables and defining ETL standards.
  • Created Source to Target Mapping Documents as per requirement.
  • Worked with the DBA to estimate the space required for each source system, based on the requirements.
  • Installed and configured MDM Hub software and set up databases for the Hub Store.
  • Created Oozie workflows to schedule Sqoop jobs that load data from Hive views into Netezza tables.
  • Worked with 4-5 source systems, loading them to build the EDW environment.
  • Ran batches in the UNIX environment, with jobs grouped according to requirements.
  • Performed the basic testing required before migrating objects to the QA/UAT environments, tested at both the UNIX and Oracle levels as needed, and provided production support by monitoring daily processes.
  • Used Talend as an ETL tool to pull data from various upstream databases and load it into the Netezza DB.
  • Worked on multiple projects using the Informatica Developer tool (IDQ), versions 9.1.0, 9.5.1, and 9.6.1.
  • Involved in migrating the maps from IDQ to PowerCenter.
  • Applied rules and profiled the source and target tables' data using IDQ.
  • Responsible for all activities related to configuring Data Loader, uploading data in CSV files into Salesforce.com (CRM), and checking the data for correctness.
  • Developed mappings using Informatica to load data from sources such as relational tables, flat files, and Oracle tables into the target data warehouse.
  • Configured the Security Access Manager, resource loading, and monitoring operations of the MDM Hub.
  • Worked with the Informatica Data Quality 9.6.1 (IDQ) toolkit for analysis, data cleansing, data matching, data conversion, exception handling, reporting, and monitoring.
  • Performed system analysis and requirement analysis; designed and wrote technical documents and test plans.
  • Created a hybrid process in IDQ by combining the IDQ Developer and Analyst versions through Logical Data Objects (LDOs).
  • Structured tables in Netezza to achieve collocation during query execution, making use of statistics, zone maps, materialized views, optimizer performance parameters, CBTs, and GROOM for optimal performance (see the maintenance sketch after this list).
  • Migrated data from one Netezza host to another using the nz_migrate utility.
  • Worked on IDQ Analyst for Profiling, Creating rules on Profiling and Scorecards.
  • Involved in performing Reconciliation based on the Source and Target.
  • Created and scheduled Workflows using Informatica workflow manager.
  • Responsible for developing complex Informatica mappings using transformations such as Union, connected and unconnected Lookup, Router, Filter, Aggregator, Expression, and Update Strategy for large volumes of data.
  • Implemented SCD Type 2 keyed on the product serial number, as per requirement (a SQL sketch of the pattern follows this list).
  • Extensively used parameter files and mapping variables while developing the mappings for all the dimension tables.
  • Responsible for error handling in workflows and sessions using Informatica Workflow Manager, and monitored workflow runs and statistics in Informatica Workflow Monitor.
  • Created connected, unconnected, and dynamic Lookup transformations for better performance.
  • Scheduled automatic workflows using Toad Scheduler.
  • Handled performance issues, troubleshooting Informatica mappings and evaluating current logic for tuning opportunities.
  • Responsible for Defining Mapping parameters and variables and Session parameters according to the requirements and performance related issues.
  • Responsible for creating production support documentation.
  • Coordinated with the business team to gather requirements, then coordinated with the development team to complete the work within the timeline.
  • Participated in weekly and bimonthly team status meetings.
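
A couple of illustrative Netezza maintenance statements of the kind the table-structuring item above refers to; the table name sales_fact is a placeholder:

      -- Refresh statistics so the optimizer costs plans accurately
      GENERATE STATISTICS ON sales_fact;

      -- Reclaim space held by logically deleted rows; on a clustered base
      -- table (CBT) this also reorganizes rows on the ORGANIZE ON keys
      GROOM TABLE sales_fact RECORDS ALL;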
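
The SCD Type 2 logic itself was built as a PowerCenter mapping (Lookup plus Update Strategy); the equivalent effect is sketched here as plain SQL so the pattern is visible. Table, column, and sequence names (dim_product, stg_product, dim_product_seq) are hypothetical:

      -- Step 1: expire the current row when a tracked attribute changes
      UPDATE dim_product d
         SET d.eff_end_date = TRUNC(SYSDATE), d.current_flag = 'N'
       WHERE d.current_flag = 'Y'
         AND EXISTS (SELECT 1
                       FROM stg_product s
                      WHERE s.serial_no = d.serial_no
                        AND s.product_desc <> d.product_desc);

      -- Step 2: insert a fresh current version for changed and brand-new serials
      INSERT INTO dim_product
             (product_key, serial_no, product_desc, eff_start_date, eff_end_date, current_flag)
      SELECT dim_product_seq.NEXTVAL, s.serial_no, s.product_desc, TRUNC(SYSDATE), NULL, 'Y'
        FROM stg_product s
       WHERE NOT EXISTS (SELECT 1
                           FROM dim_product d
                          WHERE d.serial_no = s.serial_no
                            AND d.current_flag = 'Y');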

Environment: Informatica Power Center 9.1/9.5.1/9.6.1, Informatica MDM Hub 10.1.1, Salesforce.com (SFDC), IDQ 9.1/9.5.1, Oracle 11g, PL/SQL, TOAD, UNIX Shell Scripts

Informatica Developer

Confidential, Baltimore, MD

Responsibilities:

  • Worked on handling large volumes of data, usually millions of rows per day.
  • Handled live data that typically arrived in the millions, depending on the number of customers.
  • Designed a system that accepts requests based on certain flags, and created an extract for handling such requests.
  • Stored customer data in a dimension table, with the fact table loaded incrementally.
  • Worked with a dynamic SQL transformation involving many joins on fact and dimension tables.
  • Used the Source Qualifier pre-SQL property, which involved handling many SQL queries.
  • Worked with mapping and workflow variables, including using mapping variables in the workflow and vice versa.
  • Used functions such as SETVARIABLE to assign incoming values to mapping variables.
  • Connected all the mappings belonging to a single workflow using mapping and workflow variables.
  • Read data from the FACETS and HpXr data models, then wrote ETL functional design documents and ETL technical specification documents for program development, logic, coding, changes, and corrections.
  • Analyzed different modules of the FACETS system and EDI interfaces to understand the source system and source data.
  • Worked on data cleansing and standardization using the cleanse functions in Informatica MDM.
  • Wrote several Netezza SQL scripts to load data into Netezza tables.
  • Processed claims in FACETS and validated the full cycle to make sure checks and 835s were generated.
  • Prepared the test plan for Med Pharmacy Claims to FACETS.
  • Developed test cases based on functional specifications and design documents for Med Pharmacy Claims to FACETS.
  • Worked with other teams on Facets Data model design and Facets batch processing.
  • Used pre-session and post-session success variable assignment on non-reusable Session tasks to pass workflow variables to mappings and vice versa.
  • Created UNIX scripts to schedule or unschedule workflows as required.
  • Created a concurrent workflow that is called from a different workflow using a Command task.
  • Worked on continuous scheduling of Informatica workflows.
  • Used Decision, Assignment, and Email tasks as required.
  • Created parameter files at the workflow level and merged the parameter files used at the workflow and session levels.
  • Loaded data into Netezza tables using the external table process with Net Prophet ETL (see the sketch after this list).
  • Created unit test cases and documented all of them.
  • Wrote a lookup override in a Lookup transformation to split comma-separated values into multiple rows using Oracle's CONNECT BY clause (sketched after this list).
  • Used Oracle MINUS queries to compare data between two schemas (example after this list).
  • Created a mapplet reusable across two mappings, with functions producing different output based on the incoming data.
  • Performed match/merge and ran match rules to check the effectiveness of the MDM process on the data.
  • Used the Transaction Control transformation to generate dynamic files at the mapping level.
  • Calculated the dynamic files by splitting on timestamp at the session level.
  • Improved throughput by replacing a SQL transformation with the same query in the Source Qualifier.
  • Used the PARALLEL hint to spread an Oracle query with multiple joins across 16 CPUs (see the hint example after this list).
  • Worked on Informatica dynamic partitions, useful for loading large amounts of history data.
  • Called the same workflow repeatedly, via a UNIX script, until a condition was satisfied.
  • Involved in code migration from development to integration testing, Staging and finally to Production Environment.
  • Prepared documentation required to migrate all the objects across different environments.
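
A sketch of the Netezza external-table load pattern, assuming a pipe-delimited file and a placeholder staging table; exact option names can vary by NPS version:

      -- Load a pipe-delimited file through a transient external table
      INSERT INTO stg_claims
      SELECT *
        FROM EXTERNAL '/data/inbound/claims_20140101.dat'
        SAMEAS stg_claims
        USING (DELIMITER '|' SKIPROWS 1 LOGDIR '/tmp');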
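
The comma-separated-values split described above, shown as a standalone Oracle query rather than a lookup override; table and column names are illustrative:

      -- Split 'A,B,C' in col_list into one row per value
      SELECT t.id,
             REGEXP_SUBSTR(t.col_list, '[^,]+', 1, LEVEL) AS val
        FROM src_table t
      CONNECT BY LEVEL <= REGEXP_COUNT(t.col_list, ',') + 1
             AND PRIOR t.id = t.id
             AND PRIOR SYS_GUID() IS NOT NULL;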
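
The MINUS comparison pattern, with placeholder schema and table names:

      -- Rows present in DEV but missing in QA; run the reverse for the other side
      SELECT claim_id, member_id, paid_amt FROM dev_schema.claims_fact
      MINUS
      SELECT claim_id, member_id, paid_amt FROM qa_schema.claims_fact;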
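
And the parallel-hint usage, on a hypothetical multi-join query:

      -- Ask the optimizer for a degree-16 parallel scan of the fact table
      SELECT /*+ PARALLEL(f, 16) */ f.claim_id, d.member_name, p.plan_desc
        FROM claims_fact f
        JOIN member_dim d ON d.member_key = f.member_key
        JOIN plan_dim   p ON p.plan_key   = f.plan_key;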

Environment: Informatica Power Center 9.1, Facets, Oracle 11g, PL/SQL, Informatica Master Data Management (MDM) 9.5.1, UNIX Shell Scripts, Windows XP.

ETL Developer

Confidential, Santa Clara, CA

Responsibilities:

  • Interacted with the Business Analysts to understand the business & gather technical requirements.
  • Interacted with Architect to understand the High-Level Design.
  • Helped in creation of mapping specs for better code processing.
  • Extracted relational and delimited text file sources to the local environment and developed the code in the Dev environment.
  • Checked the source flat file formats using UNIX shell scripts, ensuring input files matched the specified formats.
  • Created the mappings using transformations such as the Source qualifier, Aggregator, Expression, Lookup, Router, Filter, and Update Strategy, Sequence generator and Java Transformation.
  • Created Mapplet for error handling process to deal with null records.
  • Developed code for the landing environment, then for staging, and finally developed the incremental load to populate the target tables of the atomic model.
  • Developed code to move data to the MDM environment, and loaded the final target tables using truncate load.
  • Used B2B DT Studio for the exchange of HL7 messages into our source landing area.
  • Worked on PowerCenter Real Time Edition to run real-time workflows.
  • Configured session properties for real-time workflows.
  • Monitored the health of the Netezza hardware and filed cases with IBM when issues arose.
  • Loaded historical data for a specific period into the atomic model.
  • Developed code for backload and Incremental load.
  • Extracted HL7 messages from queues.
  • Loaded the normalized model from the atomic model with the help of OHADI.
  • This normalized model served as the staging area for the dimensional model, a Netezza-based data warehouse.
  • Worked in Aginity Workbench to process all the relational Netezza objects.
  • Performed data analysis and datatype analysis on Netezza objects in Aginity Workbench.
  • Created mappings to backload and load clean data into the Netezza staging environment.
  • Used SCD Type 1 to load the dimension tables in Netezza, and finally the fact tables of the data warehouse.
  • Used truncate load to refresh Netezza tables (see the sketch after this list).
  • Created Netezza SQL scripts to verify that the tables loaded correctly.
  • Created Reusable sessions in Task Developer.
  • Used Informatica scheduler to run the workflows in Dev Environment.
  • Worked on Informatica Power Center tool - Source Analyzer, Mapping Designer and Mapplet Designer, Transformations, Informatica Repository Manager and Informatica Workflow Manager.
  • Extensively used SQL Developer to check if data is properly loaded into target systems by writing SQL Queries.
  • Created and scheduled sessions and jobs (on demand, run on time, and run only once) using Workflow Manager.
  • Used Debugger in troubleshooting the existing mapping.
  • Performed Unit test and Integration test for the process created in Informatica and all the test cases were well documented for process improvements.
  • Created test scripts and Unit test plan documents.
  • Created Deployment Document for Migration of code from Dev to Test and from Test to Production environment.
  • Worked with the testing team to test and fix the code.
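
A sketch of the truncate-and-load refresh with a row-count check, as described above; all object names are placeholders:

      -- Full refresh: empty the target, then reload it from stage
      TRUNCATE TABLE edw_customer_dim;
      INSERT INTO edw_customer_dim SELECT * FROM stg_customer_clean;

      -- Validation: the two counts should match
      SELECT 'staged' AS side, COUNT(*) AS cnt FROM stg_customer_clean
      UNION ALL
      SELECT 'loaded', COUNT(*) FROM edw_customer_dim;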

Environment: Informatica PowerCenter 9.1, Informatica PowerCenter Real Time Edition, B2B DT Studio, MDM, Oracle 11g, SQL Developer, Putty, WinSCP, UNIX shell scripting, PL/SQL, Netezza, Aginity Workbench.

Confidential, Hopkins, MN

ETL Developer

Responsibilities:

  • Primarily responsible for converting business requirements (BRDs) into ETL specifications.
  • Worked as onsite coordinator.
  • Extracted data from different sources such as IBM MQ, Oracle, MS Access, flat files, and Teradata.
  • Implemented SFDC Sales Cloud, Service Cloud, and Web Services; created Group and Deal Room provisioning for marketing teams.
  • Used ODI to perform data integration via ELT processing: data is extracted from multiple sources, sent through several transformation processes, and loaded into a final target.
  • Researched the market for future ERP, CRM, and SCM platforms for the newly created company.
  • Wrote complex SOQL and SOSL queries across multiple objects within the SFDC database.
  • Developed complex mappings in Informatica to load data using different transformations, and maintained Informatica naming standards.
  • Used ODI to integrate heterogeneous data sources and convert raw data into useful information.
  • Worked in an Agile methodology as a lead, coordinating with an offshore team.
  • Implemented various Transformations: Expression, Joiner, Sorter, Aggregator, Lookup, Filter, Router, Update Strategy and Transaction Control Transformations.
  • Prepared unit test cases for the developed mappings and workflows.
  • Written complex SQL queries to analyze problems.
  • Used ODI Designer as a data mapping/business rules implementation and developed mappings and workflows to transfer data from source system data to target systems.
  • Collaborating with multiple internal stakeholders and prospect vendors to finalize ERP, CRM and SCM applications for the newly spun-off line-of-business
  • Used the Data Analyst tool to perform data validation and run profiles and scorecards.
  • Involved in data cleansing using Informatica and the Data Analyst tool.
  • Developed shell scripts to generate reports and send them over email (see the sketch after this list).
  • Performance-tuned mappings and sessions wherever applicable.
  • Led the ERP selection process with the steering committee, which resulted in signing an SOW with NetSuite ERP.
  • Supported UAT and fixed issues raised by users.
  • Developed UNIX shell scripts to automate daily loads, and used the Informatica scheduler to schedule workflows.
  • Led data extraction, cleansing, transformation, and migration into (cloud-based) NetSuite ERP/CRM.
  • Successfully used Agile/Scrum Method for gathering requirements and facilitated user stories workshop. Documented User Stories and facilitated Story Point discussions to analyze the level of effort on project specifications.
  • Created complex custom metadata models and templates using Informatica Metadata Manager
  • Involved in creation of various reusable custom models for creating the metadata catalogs
  • Created complex mappings using Designer to pull metadata from the CSV files of various application systems using Informatica.
  • Raised ClearQuest (CQ) requests to migrate code, new tables, and table grants from Dev to QA, UAT, and Production.
  • Involved in Installation and configuration of Informatica MDM hub.
  • Worked in Incident and change management processes (SLA).
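
A minimal sketch of the report-and-email scripting described above; the paths, saved query, and address are illustrative, not actual project values:

      #!/bin/ksh
      # Run a saved report query and mail the output (all names are placeholders)
      REPORT=/tmp/daily_load_summary_$(date +%Y%m%d).txt

      sqlplus -s "$ORA_USER/$ORA_PWD@$ORA_SID" @/scripts/sql/daily_load_summary.sql > "$REPORT"

      mailx -s "Daily ETL load summary" etl-team@example.com < "$REPORT"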

Environment: Informatica PowerCenter, Metadata Manager, MDM, Data Analyst tool, Agile, SFDC, SQL Developer, Putty, WinSCP, Oracle 10g, Teradata, SQL, PL/SQL, UNIX Shell Scripting, NetSuite ERP/OneWorld, CRM, ODI, Manufacturing (cloud-based mobile solution).

ETL Informatica Developer

Confidential, Dallas, TX

Responsibilities:

  • Interacted with business analysts to gather the requirements.
  • Analyzed the requirements to identify the tables that needed to be populated in the staging database.
  • Prepared the DDLs for the staging/work tables and coordinated with the DBA to create the development environment and data models.
  • Worked with PowerCenter Designer tools to develop mappings that extract data from XML files and load it into different flat file formats.
  • Worked on Informatica Power Center tools like Source Analyzer, Target Designer, Mapping Designer, Workflow Manager, and Workflow Monitor.
  • Designed and developed mappings using Lookup, Expression, Filter, Update Strategy, Aggregator, and Router transformations to implement the required logic.
  • Developed error handling and data quality checks in Informatica mappings.
  • Scheduled and ran extraction and load processes, monitoring tasks and workflows using Workflow Manager and Workflow Monitor.
  • Used Workflow Manager to create workflows, worklets, and email and command tasks.
  • Used Informatica features to implement Type II changes in slowly changing dimension (SCD) tables.
  • Involved in Performance Tuning of SQL Queries, Sources, Targets and sessions by identifying and rectifying performance bottlenecks.

Environment: Informatica Power Center 8.6.1, Oracle 10g, Flat files, Shell scripting
