Informatica Developer Resume
Sunnyvale, California
PROFESSIONAL SUMMARY:
- 8 years of IT experience in design, analysis, development, documentation, coding, and implementation, including Databases, Data Warehousing, ETL design, Oracle and SQL Server databases, Informatica MDM 10.x/9.x, Informatica Data Director, Informatica Power Center 10.x/9.x/8.x/7.x, and Informatica Data Quality.
- Experience working on Informatica MDM (Siperian) to design, develop, test, review, and optimize MDM solutions.
- Experience in developing landing tables, staging tables, base object tables and providing the relationships between the tables based on the Data Model.
- Worked on Informatica Data Director to implement a Customer 360-degree view for business users.
- Strong Data Warehouse experience in using Informatica Power Center client tools like Mapping Designer, Workflow Manager, Workflow Monitor and Repository Manager.
- Expertise in the Software Development Life Cycle, including requirement analysis, data mapping, build, unit testing, system integration testing, and user acceptance testing.
- Designed and implemented Data Marts/Data Warehouses using Informatica; developed transformations, mappings/mapplets, sessions, and workflows.
- Integrated various data sources like Oracle, flat files, XML, MS SQL Server and Teradata into the staging area, ODS, Data Warehouse and Data Mart.
- Experience in designing/developing complex mapping using transformations like Connected and Unconnected Lookup, Router, Filter, Expression, Aggregator, Normalizer, Joiner, Union and Update Strategy.
- Created and maintained Informatica users and privileges.
- Strong knowledge of E-R modeling and dimensional data modeling methodologies such as Star Schema and Snowflake Schema; used dimensional and multidimensional modeling.
- Experience using Informatica utilities like Pushdown Optimization and partitioning; implemented SCD (Slowly Changing Dimension) Type 1, Type 2, and Type 3 methodologies.
- Extensive experience in conducting Requirement-gathering sessions, writing Business Requirement Document (BRD), Functional Requirement Document.
- Experience in using SQL, PL/SQL, SQL*Loader, SQL*Plus, Import/Export utilities.
- Worked with Oracle Stored Procedures, Triggers, Indexes, DML, DDL, Database Links, Sequences.
- Worked on UNIX shell scripts used in scheduling Informatica pre/post session operations.
- Experience with API tools like SoapUI, Postman, and Swagger Editor.
- Excellent communication and documentation skills using tools like Visio and PowerPoint.
- Have excellent analytical, problem solving & interpersonal skills and can work as a part of a team as well as independently.
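The pre/post-session shell scripting mentioned above can be illustrated with a minimal sketch of a post-session archive step; all paths and file names here are illustrative placeholders, not taken from any specific project:

```shell
#!/bin/sh
# Post-session archive step: move a processed source file into an
# archive directory with a timestamp suffix so reruns never clobber
# earlier extracts. Paths and names are illustrative.
archive_file() {
    src="$1"
    archive_dir="$2"
    mkdir -p "$archive_dir" || return 1
    stamp=$(date +%Y%m%d%H%M%S)
    base=$(basename "$src")
    mv "$src" "$archive_dir/${base}.${stamp}"
}
```

In a Power Center workflow, a function like this would typically be invoked from a post-session success command task.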
TECHNICAL SKILLS:
ETL Tools: Informatica MDM (10.x/9.x), Informatica Power Center (10.1/9.x/8.x/7.1/6.x/5.x), Informatica Data Director, SnapLogic.
Databases: Oracle 12c/11g/10g/9i, MS SQL Server 2008/2005/2000, DB2, MemSQL, Postgres, Netezza.
Data Modeling: Dimensional Data Modeling, Star Schema Modeling, Snowflake Modeling, FACT and Dimension Tables, Physical and Logical Data Modeling.
Environments: UNIX, Windows 2000/XP/7, Windows Server 2000/2003, HP-UX, Linux.
Languages: SQL, PL/SQL, T-SQL, XML, UNIX Shell, Visual Basic.
Tools & API: Toad, Visio, SQL Developer, ERwin, MS Office Suite, SoapUI, Postman, Swagger Editor.
PROFESSIONAL EXPERIENCE:
Informatica Developer
Confidential, Sunnyvale, California
Roles & Responsibilities:
- Analyzed business requirements, following standard change control and configuration management practices and conforming to departmental application development standards and systems life cycle.
- Design, development and implementation of new system components or fixes to resolve system defects.
- Participated in software design and programming reviews.
- Designed and built data models to conform to the existing EDW architecture.
- Incorporated source code reuse wherever possible.
- Solid understanding of ETL concepts and best practices.
- Extensively worked in developing ETL program for supporting Data Extraction, transformations and loading using Informatica Power Center.
- Used debugger in identifying bugs in existing mappings by analyzing data flow, evaluating transformations.
- Worked effectively in an onsite/offshore delivery model.
- Responsible for identifying bottlenecks and fixing them through performance tuning on the Netezza database.
- Created Netezza SQL scripts to verify that tables loaded correctly.
- Used pre- and post-session assignment variables to pass variable values from one session to another.
- Developed UNIX shell scripts to run the Informatica workflows and controlling the ETL flow.
- Created user variables, property expressions, script task in SSIS.
- Implementing various SSIS packages having different tasks and transformations and scheduled SSIS packages.
- Responsible for architecting and implementing very large-scale data intelligence solutions around Snowflake Data Warehouse.
- Experience in architecting, designing and operationalization of large-scale data and analytics solutions on Snowflake Cloud Data Warehouse.
- Set up and executed component tests; tracked and documented system defects.
- Migrated code to higher environments and handed it over to the QA team for SIT and UAT.
- Worked with teams to deliver effective, high-value reporting solutions by leveraging an established delivery methodology.
- Performed data mining and analysis to uncover trends and correlations, developing insights that can materially improve our decisions.
ENVIRONMENT: Informatica Power Center 10.4, Toad for Oracle 13.1, SQL Server, SSIS, Unix Shell Scripts, Netezza, WinSCP, Windows XP.
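The Netezza load-verification scripts described in this role amount to a row-count reconciliation between source and target; a minimal sketch follows. The nzsql invocation shown in the comment, along with the database and table names, are illustrative assumptions:

```shell
#!/bin/sh
# Reconcile source vs. target row counts after a load.
# In practice each count would come from the database, e.g. (illustrative):
#   tgt_cnt=$(nzsql -d edw -A -t -c "SELECT COUNT(*) FROM sales_fact")
check_counts() {
    src_cnt="$1"
    tgt_cnt="$2"
    if [ "$src_cnt" -eq "$tgt_cnt" ]; then
        echo "MATCH"
    else
        echo "MISMATCH: source=$src_cnt target=$tgt_cnt"
        return 1
    fi
}
```

A non-zero exit on mismatch lets a scheduler or wrapper script fail the load and alert the operations team.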
Informatica MDM Developer
Confidential, Houston, Texas
Roles & Responsibilities:
- Interacted with business executives in gathering requirements, designed and translated Business Requirements into MDM’s Technical Requirement Specifications.
- Extensive hands-on experience designing and developing all aspects of MDM solutions.
- Hands-on experience with Informatica MDM Hub configurations - Data Mappings (Landing, Staging, and Base Objects), Data Validation, Match and Merge rules, and customizing/configuring Informatica Data Director applications using Entity 360.
- Experience in defining and configuring landing tables, staging tables, base objects, lookups, query groups, queries/custom queries, packages, hierarchies and foreign-key relationships.
- Expertise in Informatica MDM Hub Match and Merge rules, Batch Jobs and Batch Groups.
- Created various match rules: fuzzy match rules based on tokenization, and exact match rules that return exactly matching records.
- Designed and Developed multiple cleanse functions, graph functions.
- Maintained Informatica Data Director (IDD) for data governance to be used by the Business Users, IT Managers and Data Stewards.
- Configured the Informatica Data Director (IDD) application using hierarchy configuration and created subject area groups, subject areas, subject area child, IDD display packages in hub and search queries for searching the data.
- Experience in developing the customer 360-degree view, Authoring screens for Business Users.
- Experienced in Custom MDM and IDD User Exits.
- Created users in the Hub Console and assigned roles and privileges using Security Access Manager (SAM) so that business users could log in to the IDD screens.
- Developed the integration between Informatica MDM, ActiveVOS, SIF, and IDD.
- Hands-on experience configuring ActiveVOS on Windows Server.
- Experienced in ActiveVOS workflow design and development, including creation of Human Tasks.
- Created the Bulk Import functionality for initial load and incremental loads in IDD. Implemented the XML templates.
- Configured SOAP-based SIF API calls using SoapUI to perform tasks such as CleanTable and ExecuteBatchDelete.
- Used ETL tools such as Informatica and SnapLogic for ETL development.
- Development experience building SnapLogic pipelines, error handling, and scheduling tasks and alerts.
- Experienced in development, testing, and end-to-end implementations with SnapLogic.
- Strong experience with Toad for SQL Server.
- Created the CMX backup database to back up the existing CMX database in SQL server.
- Created user exits using Java.
- Good understanding of JSON, Java development, SoapUI, and WSDL files.
ENVIRONMENT: Informatica MDM Hub 10.2/10.3, Toad for SQL Server 2012, MemSQL, Postgres, Windows XP, Informatica Data Director, SnapLogic, SoapUI, Postman.
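The SOAP-based SIF calls described in this role can be sketched by building the request envelope in a shell function and posting it with curl instead of SoapUI. The namespace, element names, endpoint URL, host, and ORS identifier below are illustrative assumptions; the authoritative contract comes from the SIF WSDL for the specific hub:

```shell
#!/bin/sh
# Build a SOAP envelope for a SIF CleanTable-style request.
# All names below are illustrative placeholders, not the exact SIF schema.
build_clean_table_envelope() {
    table="$1"
    cat <<EOF
<soapenv:Envelope xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/"
                  xmlns:urn="urn:siperian.api">
  <soapenv:Body>
    <urn:cleanTable>
      <urn:tableName>$table</urn:tableName>
    </urn:cleanTable>
  </soapenv:Body>
</soapenv:Envelope>
EOF
}
# Sending it would look like (not executed here; endpoint is illustrative):
#   build_clean_table_envelope C_CUSTOMER |
#     curl -s -H "Content-Type: text/xml" -d @- http://mdmhost:8080/cmx/request/ORS_ID
```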
ETL Developer
Confidential, Minneapolis, Minnesota
Roles & Responsibilities:
- Interactive collaboration with business analysts and gathered functional requirements and designed technical design documents for ETL process.
- Extensive Data modeling experience using Dimensional Data modeling, Star Schema modeling, Snowflake modeling, and FACT and Dimensions tables.
- Operated Informatica Power Center to load data from different sources like flat files and Oracle, Teradata into the Oracle Data Warehouse.
- Maintained existing ETL frameworks and methodologies using Netezza SQL and UNIX shell scripting.
- Developed the ETL programs in Netezza to load the data periodically into data warehouse.
- Worked on CDC (Change Data Capture) to implement SCD (Slowly Changing Dimensions).
- Hands-on experience creating and converting Oracle scripts (SQL, PL/SQL) to Teradata scripts.
- Configured alerting rules for the Power Center operations team: missing-file monitoring, process-not-started, rejected records, and long-running jobs.
- Performed the data profiling and analysis making use of Informatica Data Explorer (IDE) and Informatica Data Quality (IDQ).
- Assisted the QC team in carrying out its QC process of testing the ETL components.
- Created pre-session and post-session shell scripts and email notifications.
- Involved in complete cycle from Extraction, Transformation and Loading of data using Informatica best practices.
- Involved in Data Quality checks by interacting with the business analysts.
- Performed unit testing and tuned the mappings for better performance.
- Maintained documentation of ETL processes to support knowledge transfer to other team members.
- Wrote UNIX shell scripts for scheduling various data-cleansing scripts and automating execution of workflows.
- Developed ETL mappings, transformations using Informatica Power Center 9.6.
- Developed data Mappings between source systems and target system using Mapping Designer.
- Developed shared folder architecture with reusable Mapplets and Transformations.
- Extensively worked with the Debugger for handling the data errors in the mapping designer.
- Created events and various tasks in the workflows using workflow manager.
- Responsible for tuning ETL procedures to optimize load and query Performance.
- Implemented pushdown, pipeline partition, persistence cache for better performance.
- Setting up Batches and sessions to schedule the loads at required frequency using Informatica workflow manager and external scheduler (Autosys).
- Applied business rules that identify the relationships among the data using Informatica Data Quality.
- Developed unit test cases and unit test plans to verify the data loading process, and used UNIX scripts for automating processes.
- Tested the mapplets and mappings as per Quality and Analysis standards before moving to production environment.
- Took part in Informatica administration; migrated development mappings using deployment groups and hot-fixed them in the production environment.
- Involved in writing shell scripts for file transfers, file renaming and several other database scripts to be executed from UNIX.
ENVIRONMENT: Informatica Power Center 9.6, Informatica IDQ 9.5, Power Exchange, Teradata, Oracle 12c, Netezza, MS Access, UNIX Shell Scripts, Windows NT/2000/XP, SQL Server 2008, DB2.
Informatica Developer
Confidential, Saint Louis, Missouri
Roles & Responsibilities:
- Drafting the requirements, implementing design and development of various components of ETL for various applications.
- Extracted data from Oracle databases, spreadsheets, and CSV files, staged it in a single location, and applied business logic to load it into the central Oracle database.
- Designed and developed Informatica Mappings and Sessions based on user requirements and business rules to load data from source flat files and RDBMS tables to target tables.
- Used Source Analyzer and Warehouse designer to import the source and target database schemas, and the Mapping Informatica Designer to create complex mappings from Business requirements.
- Designed and Developed IDQ mappings for address validation / cleansing, doctor master data matching, data conversion, exception handling, and report exception data.
- Used Informatica Power Center to Extract, Transform, and Load data into the Netezza Data Warehouse from various sources like Oracle and flat files.
- Created and Configured Workflows, Worklets, and Sessions to transport the data to target warehouse Netezza tables using Informatica Workflow Manager.
- Integration of various data sources like Oracle, SQL Server, Fixed Width & Delimited Flat Files, DB2.
- Involved in SQL Server Integration Services (SSIS), creating packages that move data from a single data source to a destination with no transformations.
- Involved in the Unit Testing and Integration testing of the workflows developed.
- Extensively worked with Korn-Shell scripts for parsing and moving files and even for re-creating parameter files in post-session command tasks.
- Imported Source/Target Tables from the respective databases and created reusable transformations like Joiner, Routers, Lookups, Filter, Expression and Aggregator and created new mappings using Designer module of Informatica.
- Created different rules as mapplets, Logical Data Objects (LDO), workflows. Deployed the workflows as an application to run them. Tuned the mappings for better performance.
- Working with database connections, SQL joins, cardinalities, loops, aliases, views, aggregate conditions, parsing of objects and hierarchies.
- Developed shell scripts for running batch jobs and scheduling them.
- Handled User Acceptance Testing and System Integration Testing, in addition to unit testing, with the help of Quality Center as the bug-logging tool. Created and documented the Unit Test Plan (UTP) for the code.
ENVIRONMENT: Informatica Power Center 9.6, Informatica Multi Domain MDM 9.1, Informatica Data Quality (IDQ) 9.5, Oracle 11g, SQL Server, PL/SQL, Netezza, UNIX, WinSCP, SSIS.
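The re-creation of parameter files from post-session command tasks mentioned in this role can be sketched as a small generator function. The folder, workflow, session, and parameter names below are illustrative placeholders, not from any real repository:

```shell
#!/bin/sh
# Re-create an Informatica parameter file from a post-session command
# task, so the next run picks up a fresh run date. The section header
# scopes the parameters to one session of one workflow; all names here
# are made up for illustration.
write_param_file() {
    out="$1"
    run_date="$2"
    cat > "$out" <<EOF
[DW_Folder.WF:wf_daily_load.ST:s_m_load_sales]
\$\$RUN_DATE=$run_date
\$\$SRC_DIR=/data/inbound
EOF
}
```

Regenerating the file after each successful run keeps mapping parameters such as the run date in sync without manual edits.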
Informatica Developer
Confidential
Roles & Responsibilities:
- Collaborated with business analysts and data analysts to understand and analyze requirements and arrive at robust designs and solutions.
- Converted business requirements into highly efficient, reusable and scalable Informatica ETL processes.
- Utilized dimensional and star-schema modeling to come up with new structures to support drill down.
- Created mapping documents to outline source-to-target mappings and explain business-driven transformation rules.
- Involved in data standardization, i.e., transforming data sets to conform to a new standard.
- Dealt with massive data profiling prior to data staging.
- Generated Technical Specification Documents and Solution Design Documents to outline the implementation plans for the requirements.
- Involved in testing of Stored Procedures and Functions, Unit and Integrating testing of Informatica Sessions, Batches and the Target Data.
- Responsible for developing, support and maintenance for the ETL (Extract, Transform and Load) processes using Informatica Power Center 9.5 by using various transformations like Expression, Source Qualifier, Filter, Router, Sorter, Aggregator, Update Strategy, Connected and unconnected look up etc.
- Created Informatica components required to operate Data Quality (Power Center required)
- Designed best practices on Process Sequence, Dictionaries, Data Quality Lifecycles, Naming Convention, and Version Control.
- Developed scripts for creating tables, views, synonyms and materialized views in the data mart.
- Created PL/SQL programs like procedures, function, packages, and cursors to extract data from Target System.
ENVIRONMENT: Informatica Power Center 9.5, Oracle11g, Teradata, UNIX Shell Scripts.
ETL Analyst
Confidential
Roles & Responsibilities:
- Interacted with business analysts and translated business requirements into technical specifications.
- Using Informatica Designer, developed mappings, which populated the data into the target.
- Used Source Analyzer and Warehouse Designer to import the source and target database schemas and the mapping designer to map the sources to the targets.
- Responsibilities included designing and developing complex Informatica mappings including Type-II slowly changing dimensions.
- Worked extensively on Workflow Manager, Workflow Monitor and Worklet Designer to create, edit and run workflows, tasks, shell scripts.
- Developed complex mappings/sessions using Informatica Power Center for data loading.
- Enhanced performance for Informatica session using large data files by using partitions, increasing block size, data cache size and target-based commit interval.
- Extensively used aggregators, lookup, update strategy, router and joiner transformations.
- Developed the control files to load various sales data into the system via SQL*Loader.
- Extensively used TOAD for data analysis, error fixing, and development.
- Involved in the design, development and testing of the PL/SQL stored procedures, packages for the ETL processes.
- Developed UNIX shell scripts to automate repetitive database processes and maintained shell scripts for data conversion.
ENVIRONMENT: Informatica Power Center, Oracle 9i, PL/SQL, TOAD, UNIX, Control-M.
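The SQL*Loader control files mentioned in this role can be sketched with a generator that writes a control file for a delimited sales feed. The table, columns, and file names are illustrative placeholders:

```shell
#!/bin/sh
# Generate a SQL*Loader control file for a comma-delimited sales feed.
# Table and column names are illustrative, not from any real schema.
write_ctl() {
    cat > "$1" <<'EOF'
LOAD DATA
INFILE 'sales_daily.csv'
APPEND
INTO TABLE stg_sales
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
(sale_id, sale_date DATE "YYYY-MM-DD", product_code, amount)
EOF
}
# Loading would then look like (not executed here; credentials elided):
#   sqlldr userid=etl_user control=sales.ctl log=sales.log
```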