
Senior Informatica Developer Resume


Austin, TX

SUMMARY:

  • Varun is an accomplished Informatica Developer with 8+ years of experience in Information Technology spanning requirement gathering, data analysis, design, and development of software applications, with expertise in data warehousing solutions using Informatica PowerCenter and in SQL, PL/SQL, stored procedures, functions, packages, triggers, and cursors.
  • Involved in dimensional data modeling, creating fact and dimension tables.
  • Experienced in OLTP/OLAP system study, analysis, and ER modeling, and in developing database schemas such as star and snowflake schemas used in relational and multidimensional modeling.
  • Extensive experience in extraction, transformation, and loading of data directly from various data sources such as flat files, Excel, SQL Server, and Oracle.
  • Used Informatica client tools such as B2B Data Studio, Designer, Source Analyzer, Warehouse Designer, Transformation Developer, Mapplet Designer, Mapping Designer, Workflow Manager, and Workflow Monitor.
  • Experience working with various databases and sources such as Oracle 12c/10g/9i/8.x/7.x, SQL Server 2005/2000, Teradata, DB2, Sybase, Excel sheets, flat files, and XML files.
  • Extensive experience in writing PL/SQL stored procedures, triggers, functions, and packages.
  • Experience in administration activities such as creating and managing repositories, users, user groups, and folders, and working with administrator functions of Repository Manager.
  • Working experience using Informatica Workflow Manager to create sessions, batches, and worklets and to schedule workflows.
  • Experience in implementing change data capture (CDC) by comparing timestamps between source and target data; a minimal SQL sketch follows this summary.
  • Well experienced in error handling and troubleshooting using various log files.
  • Experienced in creating jobs and alerts using SQL Server Agent and SQL Mail; well versed in high-availability solutions such as clustering, mirroring, and log shipping.
  • Experienced in devising disaster recovery strategies and testing them effectively.
  • Experience designing strategies for audit tables, load balancing, exception handling, and high data volumes.
  • Good exposure to development, testing, debugging, implementation, documentation, end-user training, and production support.
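
The timestamp-driven CDC approach mentioned above can be illustrated with a minimal SQL sketch. The table and column names (src_orders, etl_load_control, last_update_ts, last_extract_ts) are hypothetical placeholders, not objects from any of the projects below.

```sql
-- Minimal sketch of timestamp-based change data capture (hypothetical names).
-- Pull only the source rows modified since the last successful extract,
-- then advance the watermark after the load completes.
SELECT o.*
FROM   src_orders o
WHERE  o.last_update_ts > (SELECT c.last_extract_ts
                           FROM   etl_load_control c
                           WHERE  c.table_name = 'SRC_ORDERS');

-- After a successful load, move the watermark forward for the next run.
UPDATE etl_load_control
SET    last_extract_ts = SYSTIMESTAMP
WHERE  table_name = 'SRC_ORDERS';
```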

TECHNICAL SKILLS:

  • ETL Tools: Informatica PowerCenter 9.6.1/9.5/9.1/8.6.1/8.1.1/7.1.2/6.2/5.1, Informatica PowerExchange, PowerExchange Change Data Capture (CDC)
  • Databases: Oracle 12c/11g/10g/9i/8i/7.x, MS SQL Server 2008/2005/2000, Sybase 12.0/11.x/10, DB2 UDB 7.2, MySQL 5.0/4.1, Teradata, MS Access 2003/2000/97/7.0
  • Editors and Database Tools: Toad, SQL Navigator, Exceed, SQL*Loader 11g/10g
  • Data Modeling: Erwin 4.1/4.0, ER-Studio
  • Operating Systems: Windows 7/NT/2000/XP, MS-DOS, UNIX, HP-UX, Sun Solaris 2.x/7/8, IBM AIX
  • Reporting Tools: Cognos 8.4/7.1/6.0 (Impromptu, PowerPlay, Transformer), MS SQL Server Reporting Services 2005, Business Objects XI, Crystal Reports 10/2008, Oracle Reports 2.5
  • Languages: SQL, PL/SQL, T-SQL, C, C++, UNIX shell scripting, VBScript
  • Other Tools: IBM Rational Team Concert (RTC)

PROFESSIONAL EXPERIENCE:

Senior Informatica Developer

Confidential, Austin, TX

Responsibilities:
  • Actively participated in team meetings with various business teams and business analysts to gather and analyze requirements and propose the best approach to adopt.
  • Migrated data from the DOMS application to CAM through the customer hub system, cleansing and standardizing contacts along the way.
  • Involved in creating mappings with the Web Service Consumer transformation to call the CAM web service that maintains CAM contracts.
  • Developed the Web Service Consumer transformation by importing the WSDL and defining the port definitions.
  • Extensively involved in creating web service requests using the Web Service Consumer and HTTP transformations.
  • Developed Oracle packages and procedures for dequeuing the response XMLs for each request (a hedged PL/SQL sketch follows the environment line for this role).
  • Developed the Data Transformation for parsing the response XML and cleansing the data through the DS SAP services team to cleanse and standardize the contacts.
  • Extensively supported production issues, debugging and fixing them so data could flow freely to process orders.
  • Worked with the team to schedule the Informatica and database jobs through the Control-M scheduler.
  • Actively participated in sprint and daily status calls to discuss work progress and the issues that came up during integration.
  • Participated in test case review sessions with the development team and testers to clarify their issues and ensure all test cases were executed before promotion to production.
  • Worked with the CIL integration team to process newly created customers' data into the CAM application and the MDM hub.
  • Created UNIX shell scripts to SFTP the sales rep ID files from the Informatica server to the Omega server.
  • Involved in generating reports on a timely basis and configured Splunk to monitor the logs of service requests, responses, and reports.
  • Created mappings, mapplets, and workflows in Informatica PowerCenter 9.6.1 to load data into stage tables and then into the Initiate tables, and to generate XMLs of cleansed data sent to the target server, where the MDM batch processor (a Java application) picks up the XMLs and processes the records into the master hub.
  • Validated the XML requests and responses using SoapUI as required.
  • Involved in documenting the designs and mapping documents.
  • Identified organization records and person records by applying business rules and constructed the XML using the XML transformation.
  • Designed mappings, mapplets, and workflows to associate organizational customer accounts with credit party accounts using DFS (Dell Financial Services) records.
  • Involved in creating stored procedures, packages, triggers, tables, views, synonyms, and test data in Oracle.
  • Extensively worked on developing mappings and resolving complex logic and performance issues around the DNB service, which is called through a Java transformation to send requests for the DNB match process against the WorldBase records.
  • Involved in a project POC with the team to implement MongoDB (a NoSQL database) and Elasticsearch (ELK) for fast search results when retrieving customer details through service calls.
  • Worked with static and dynamic memory caches for better throughput of sessions containing Rank, Lookup, Joiner, Sorter, and Aggregator transformations.

Environment: Informatica PowerCenter 9.6.1/9.5/9.1/8.6.1, Informatica B2B Data Studio 9.6.1/9.5/9.1/8.6.1, Oracle 12c, PL/SQL, XML, flat files, Toad, UNIX, Splunk, MongoDB 3.0, ELK (Elasticsearch) 5.0.2, Control-M scheduler.
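
As a rough illustration of the dequeue procedures mentioned in this role, the PL/SQL below sketches reading one response XML from an Oracle Advanced Queuing queue and staging it for downstream parsing. The queue name CAM_RESPONSE_Q, the staging table stg_cam_response, and the XMLType payload are assumptions for illustration, not the actual implementation.

```sql
-- Hedged sketch: dequeue a single response XML and stage it for parsing.
CREATE OR REPLACE PROCEDURE dequeue_cam_response IS
  l_dequeue_opts DBMS_AQ.DEQUEUE_OPTIONS_T;
  l_msg_props    DBMS_AQ.MESSAGE_PROPERTIES_T;
  l_msg_id       RAW(16);
  l_payload      XMLTYPE;                        -- assumes an XMLType queue payload
BEGIN
  l_dequeue_opts.wait       := DBMS_AQ.NO_WAIT;  -- do not block if the queue is empty
  l_dequeue_opts.navigation := DBMS_AQ.FIRST_MESSAGE;

  DBMS_AQ.DEQUEUE(
    queue_name         => 'CAM_RESPONSE_Q',      -- hypothetical queue name
    dequeue_options    => l_dequeue_opts,
    message_properties => l_msg_props,
    payload            => l_payload,
    msgid              => l_msg_id);

  -- Stage the raw response so the parsing mappings can pick it up.
  INSERT INTO stg_cam_response (msg_id, response_xml, load_ts)
  VALUES (l_msg_id, l_payload, SYSTIMESTAMP);
  COMMIT;
EXCEPTION
  WHEN OTHERS THEN
    ROLLBACK;
    RAISE;
END dequeue_cam_response;
/
```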

Senior Informatica Developer

Confidential, Lake Oswego, OR

Responsibilities:
  • Interacted actively with business analysts and data modelers on mapping documents and the design process for various sources and targets as part of project coordination. Worked directly with the senior architect and other team members to create ETL design specifications and support the implementation.
  • Created mappings, mapplets, and workflows in Informatica PowerCenter 9.5.1 to load data into stage tables and then into the EOP Info main tables.
  • Involved in creating mappings and workflows to pull the 835 files using B2B Data Studio and load the unstructured data into stage tables in the KPCC database.
  • Involved in translating the functional requirements into technical mapping specifications; leveraged Informatica PowerCenter 9.5.1 and B2B Data Studio to extract data from 835 files and check files in the EPIC landing zone and load the Oracle tables and XML files that populate billing and payments to vendors on external claims for the provider, which are then loaded into the XML view tables.
  • Involved in creating an XML mapping and workflow in Informatica that generates an EOP record for each vendor/subscriber/member paid through the check cycle, using data loaded into the KPCC EOP tables. Partial information from this record is stored in the DASP Document table. The process then creates an EOP XML file for all the vendor/subscriber/member EOPs in the ETL landing zone.
  • Involved in creating UNIX shell scripts to SFTP the 835 files to the ETL landing zone and to generate the UNIX configuration file used in creating the XML in the ETL landing zone.
  • Worked extensively with the team to create a UNIX shell script process that transfers the EOP XML file from the ETL landing zone to the NEDI landing zone.
  • Worked with Informatica transformation processes to load the 835 CHK files into the KPCC schema by mapping the data to the corresponding EOP tables.
  • Involved in scheduling the UNIX and ETL jobs with IBM Tivoli Workload Scheduler (TWS) to run after the 835 and CHK files are transferred successfully into the Tapestry landing zone.
  • Created and used stored procedures, packages, triggers, tables, views, synonyms, and test data in Oracle.
  • Extensively worked on performance improvements using pushdown optimization on source and target.
  • Worked with the DMS team to upload the EOP documents received from DST and store them for the CARS team.
  • Used transformations such as Source Qualifier, Aggregator, Filter, Router, Sequence Generator, Lookup, Rank, Joiner, Expression, Stored Procedure, and Update Strategy to implement business logic in the mappings.
  • Worked with static and dynamic memory caches for better throughput of sessions containing Rank, Lookup, Joiner, Sorter, and Aggregator transformations.
  • Created Mapplets, reusable transformations and used them in different mappings.
  • Created workflows and used various tasks such as Email, Event-Wait, Event-Raise, Timer, Scheduler, Control, Decision, and Session in the Workflow Manager.
  • Extensively worked with Source Analyzer, Warehouse Designer, Transformation Designer, Mapping Designer and Mapplet Designer.
  • Responsible for monitoring all the sessions that are running, scheduled, completed and failed. Debugged the mapping of the failed session.
  • Developed and executed test plans and test cases to validate and check referential integrity of data extracts before loading them into the data warehouse.
  • Wrote FTP scripts and extracted unstructured data from 835 data formats using Perl/shell scripts; XML and flat file formats were also extracted successfully.
  • Extensively worked on SQL tuning to increase Source Qualifier throughput by analyzing queries with explain plan and creating new indexes, partitions, and materialized views (see the SQL sketch after the environment line for this role).
  • Designed and developed error handling mechanism in Informatica.
  • Extensively worked on performance improvements by partitioning sessions, which improved performance by creating multiple connections to sources and targets and loading data in parallel pipelines.

Environment: Informatica PowerCenter 9.5/9.1/8.6.1, Informatica PowerExchange 9.1/8.6, Oracle 11g, PL/SQL, XML, flat files, UNIX (AIX 5.3), Toad, Teradata, SQL Developer, SQL*Loader, AS/400, DB2, shell & Perl scripting, HP Quality Center, IBM Tivoli Workload Scheduler
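
The explain-plan-driven SQL tuning described in this role can be sketched as below; the claims and payments tables, the query, and the index name are hypothetical stand-ins for the actual Source Qualifier override queries.

```sql
-- Hedged sketch: capture and review the optimizer plan for an override query.
EXPLAIN PLAN FOR
SELECT c.claim_id, c.vendor_id, p.paid_amt
FROM   kpcc_claims   c
JOIN   kpcc_payments p ON p.claim_id = c.claim_id
WHERE  c.check_cycle_dt >= TRUNC(SYSDATE) - 7;

-- Inspect the plan for full scans, bad join orders, or missing partitions.
SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);

-- If the plan shows a full scan on the filter column, an index may help.
CREATE INDEX ix_kpcc_claims_cycle_dt ON kpcc_claims (check_cycle_dt);
```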

Senior Informatica Developer

Confidential, MO

Responsibilities:
  • Involved in Dimensional Data Modeling and populating the Business rules into the repository for metadata management.
  • Generated surrogate keys using key management functionality for newly inserted rows in the data warehouse and defined reference lookups and update transformations (a hedged sequence-based sketch follows the environment line for this role).
  • Created stored procedures, packages, triggers, tables, views, synonyms, and test data in Oracle.
  • Automated the load process using UNIX shell scripts.
  • Developed several Test Plans, Unix Scripts for Unit/Team Testing.
  • Used PVCS for version control and AutoSys for job scheduling.
  • Created complex stored procedures and functions to support efficient data storage and manipulation as well as database Replication.
  • Used transformations such as Source Qualifier, Aggregator, Lookup, Filter, Sequence Generator, and Update Strategy.
  • Created new database objects like Stored Procedures, Functions, Packages, Triggers, Indexes and Views using T-SQL in Development and Production Environment for SQL Server.
  • Developed Database Triggers to enforce Data integrity and additional Referential Integrity.
  • Developed SQL queries to fetch complex data from different tables in remote databases using joins and database links, formatted the results into reports, and kept logs.
  • Involved in performance tuning and monitoring of SQL and PL/SQL blocks.
  • Excelled in the deployment and generation of reports.
  • Prepared reports for day-to-day as well as weekly/monthly purposes.

Environment: Informatica PowerCenter 8.6.1/8.1, SQL Server 2005, Oracle 9i, flat files, PVCS, Toad 8.0, UNIX, Business Objects 5.1.
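
A minimal sketch of sequence-based surrogate key generation along the lines of the key management step mentioned above; the dimension table dim_customer, its columns, and the sequence are illustrative assumptions.

```sql
-- Hypothetical sequence used to mint surrogate keys.
CREATE SEQUENCE seq_customer_key START WITH 1 INCREMENT BY 1 CACHE 100;

-- Assign surrogate keys only to natural keys not yet present in the dimension.
INSERT INTO dim_customer (customer_key, customer_nk, customer_name, load_dt)
SELECT seq_customer_key.NEXTVAL,
       s.customer_nk,
       s.customer_name,
       SYSDATE
FROM   stg_customer s
WHERE  NOT EXISTS (SELECT 1
                   FROM   dim_customer d
                   WHERE  d.customer_nk = s.customer_nk);
```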

Informatica Developer

Confidential, Minnesota, MN 

Responsibilities:
  • Developed various mappings with the collection of all sources, targets, and transformations using Informatica Designer.
  • Developed mappings using transformations such as Expression, Filter, Joiner, and Lookup for better data massaging and to migrate clean and consistent data (a rough SQL analogue of this lookup logic follows the environment line for this role).
  • Extracted data from various sources across the organization (Oracle, SQL Server, and flat files) and loaded it into the staging area.
  • Created and scheduled sessions and batch processes to run on demand, on time, or only once using Informatica Workflow Manager, and monitored the data loads using the Workflow Monitor.
  • Participated in testing and performance tuning by identifying bottlenecks in mapping logic and resolving them, setting cache values, and creating partitions for parallel processing of data.
  • Developed reports in Cognos Impromptu and PowerPlay.

Environment: Oracle 9i, SQL Server 2000, PL/SQL, Informatica PowerCenter 8.6.2/7.1/6.1, Erwin, Cognos, Windows NT/2000, UNIX
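
As a rough SQL analogue of the lookup and filter logic in those mappings, the sketch below merges staged rows into a target, updating matches and inserting new keys; all object names are hypothetical.

```sql
-- Hedged sketch: staging-to-target load mirroring a lookup + update-else-insert flow.
MERGE INTO tgt_customer t
USING (SELECT s.customer_id,
              TRIM(s.customer_name) AS customer_name,
              s.status
       FROM   stg_customer s
       WHERE  s.customer_id IS NOT NULL) src      -- filter out unusable rows
ON (t.customer_id = src.customer_id)              -- lookup on the business key
WHEN MATCHED THEN
  UPDATE SET t.customer_name = src.customer_name,
             t.status        = src.status
WHEN NOT MATCHED THEN
  INSERT (customer_id, customer_name, status)
  VALUES (src.customer_id, src.customer_name, src.status);
```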

Informatica Developer

Confidential, Irving, TX

Responsibilities:
  • Wrote Oracle stored procedures and SQL scripts and called them from Perl and shell scripts at pre- and post-session (a hedged pre-session sketch follows the environment line for this role).
  • Collected data source information from all the legacy systems and existing data stores.
  • Prepared the data conversion strategy per the business rules.
  • Developed complex mappings using multiple sources and targets in different databases, flat files.
  • Applied performance tuning on targets, sources, mappings and sessions to improve system performance.
  • Actively involved in gathering requirements and acquiring application knowledge from Business Managers & Application Owners.
  • Designed data model structure and E-R Modeling with all the related entities and relationship with each entity based on the rules provided by the business manager using Erwin.
  • Used workflow manager for session management, database connection management and scheduled the jobs to run in the batch process.
  • Deployment of Informatica jobs into testing, Validation and production environments.
  • Wrote and used UNIX shell scripts extensively for scheduling and pre/post-session management.
  • Involved in the performance tuning process by identifying and optimizing source, target, mapping and session bottlenecks.

Environment: Informatica PowerCenter 8.6.1/7.1.4/6.2, Oracle 11g, PL/SQL, SQL Server 2005, T-SQL, SQL*Loader 11g, Erwin 4.0, Windows 2008, Perl & Korn scripts, SAP R/3, flat files, SAS data, Toad, Solaris 10, AutoSys, ODBC, Power Analyzer
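
The pre-session stored procedure calls mentioned in this role might look roughly like the sketch below: a procedure that truncates a staging table and writes an audit row before the session runs. The staging table, audit table, and job name are hypothetical.

```sql
-- Hedged sketch: pre-session preparation callable from a shell or Perl wrapper.
CREATE OR REPLACE PROCEDURE pre_session_prepare (p_job_name IN VARCHAR2) IS
BEGIN
  -- Staging is reloaded on every run, so clear it first.
  EXECUTE IMMEDIATE 'TRUNCATE TABLE stg_orders';

  -- Record the run so load history can be audited.
  INSERT INTO etl_audit_log (job_name, run_status, run_ts)
  VALUES (p_job_name, 'STARTED', SYSTIMESTAMP);
  COMMIT;
END pre_session_prepare;
/

-- A shell wrapper would typically call it with something like:
--   sqlplus -s user/pass <<EOF
--   EXEC pre_session_prepare('ORD_LOAD_DAILY');
--   EXIT;
--   EOF
```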

ETL Developer

Confidential

Responsibilities:
  • Designed and developed data tables to store employee payroll information (a hedged DDL sketch follows the environment line for this role).
  • Interacted with end users to prepare system requirements.
  • Identified the data and activities that needed to be maintained.
  • Wrote and debugged PL/SQL scripts, stored procedures, and functions.
  • Responsible for migrating stored procedures into Informatica mappings to resolve performance issues.
  • Involved in performance tuning of the application by identifying bottlenecks in SQL and providing input to the application programmer to correct and implement the right components.
  • Created Session Task, Email and Workflow to execute the mappings. Used Workflow Monitor to monitor the jobs, reviewed error logs that were generated for each session, and rectified any cause of failure.
  • Defined Target Load Order Plan for loading data into Target Tables.
  • Involved in the documentation of the ETL process with information about the various mappings, the order of execution for them and the dependencies.

Environment: Informatica Power Center 6.2, Oracle 9i, TOAD, Windows 2000, PL/SQL, MS Excel, IBM UDB DB2, Erwin, UNIX, Business Objects, Autosys.
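
A hedged sketch of the kind of payroll tables referred to above; the layout, columns, and constraints are illustrative assumptions rather than the actual design.

```sql
-- Hypothetical employee and payroll tables (illustrative layout only).
CREATE TABLE employee (
  employee_id    NUMBER(10)   PRIMARY KEY,
  first_name     VARCHAR2(50) NOT NULL,
  last_name      VARCHAR2(50) NOT NULL,
  hire_date      DATE         NOT NULL,
  department_id  NUMBER(6)
);

CREATE TABLE payroll (
  payroll_id        NUMBER(12)   PRIMARY KEY,
  employee_id       NUMBER(10)   NOT NULL REFERENCES employee (employee_id),
  pay_period_start  DATE         NOT NULL,
  pay_period_end    DATE         NOT NULL,
  gross_pay         NUMBER(12,2) NOT NULL,
  deductions        NUMBER(12,2) DEFAULT 0,
  net_pay           NUMBER(12,2) NOT NULL
);
```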
