
Sr. Informatica Developer Resume


Orlando, FL

SUMMARY

  • 6 years of IT experience in requirement analysis, design, development, and implementation of data warehousing and data mart projects using the ETL tool Informatica
  • Work experience in Data Quality (data cleansing and conversion), System Integration, and Data Migration through ETL flows, with good exposure to ETL and Data Quality processes using Informatica Data Quality (IDQ) and Informatica Power Center tools
  • Experience in ETL/data integration using Informatica Power Center 9.x/8.x/7.x
  • Experience in extracting data from source systems like Oracle, SQL Server, Teradata, and MS Access and from non-relational sources like flat files.
  • Experience in working with business analysts to identify and understand requirements and translate them into ETL requirement documents during the Requirement Analysis phase.
  • Used Change Data Capture (CDC) techniques like slowly changing dimensions (Type 1, Type 2, and Type 3), slowly growing targets, and simple pass-through mappings using Power Center.
  • Well versed with SQL*Loader, Packages, Triggers, PL/SQL Development and Tuning Stored Procedures.
  • Hands-on experience in interacting with clients and gathering requirements for modules and enhancements.
  • Experience in creating High Level Design and Detailed Design documentation in the Design phase.
  • Good knowledge of the full ETL life cycle in Informatica Power Center (Repository Manager, Mapping Designer, Workflow Manager & Workflow Monitor)
  • Expertise in using multiple Informatica transformations like Source Qualifier, Expression, Filter, Router, Rank, Sorter, Aggregator, Joiner, Look up, Update strategy, Sequence Generator etc.
  • Extensively worked on Power Center client: Repository Manager, Mapplet Designer, Mapping Designer, Workflow Manager and Monitor to extract, transform and load data
  • Efficient in troubleshooting mapping bottlenecks, performance tuning and debugging. Identifying and resolving performance bottlenecks in various levels like sources, targets, mappings and sessions.
  • Used Informatica Designer to build mappings and Workflow Manager to create processing tasks that move data from multiple sources into targets.
  • Worked on Slowly Changing Dimensions (SCDs) and implemented Type 1 and Type 2 (flag and timestamp); see the SQL sketch at the end of this summary
  • Used Maestro and Informatica Scheduler for scheduling batch cycle
  • Extensively worked on debugging Informatica mappings, mapplets, sessions and workflows.
  • Good understanding of Third normal form, Star and Snowflake Schema, Dimensional Modeling, Data Marts
  • Knowledge of Teradata utility scripts like FastLoad, MultiLoad, and BTEQ to load data from various source systems to Teradata.
  • Successfully implemented a Single Sign-On solution for Oracle RMS and other enterprise applications using the Oracle IDM Suite (OAM, OIM, PIM and API Gateway)
  • Successfully implemented medium to large scale Oracle applications, including databases (Oracle RMS, Oracle ERP, Oracle SOA, BPM and IDM Suite), in HA and clustered infrastructures
  • Hands-on, detailed expertise in Master Data Management (MDM) and legacy data migrations within the ERP domain. Adapted various methodologies such as Agile (Sprint), RUP, and Waterfall in the areas of project planning, functional analysis & documentation, customer master data migrations (MDM), legacy-to-active conversions, GAP analysis, etc.
  • Experienced in technology-oriented modules: BW/BI, LSMW, Workflow, ILM-Archiving, IXOS (Open Text), ALE, Security, SOLMAN, CHARM, HANA, etc.
  • Experience in core banking applications such as Flexcube 12.0.1, ACBS 6.0, SWIFT, PEGA payments and customers, East Net, Fed link, Actimize, SSB, Business Objects reports, and BI reports.
  • Implemented Canada branch operations for Money Market, Foreign exchange, Funds transfer (Single currency and cross currency) modules and Swift payments.
  • Supported Business Objects reporting team by creating BO universe according to the end user requirements
  • Wrote SQL queries to create end-user reports and developed SQL queries and stored procedures in support of ongoing work and application support.
  • Experience in post-production support, job monitoring using scheduling tools, and incident management tools (ServiceNow).
  • Expertise in documenting the ETL process, status reports and meeting minutes.
  • Experience in creating Reusable Transformations to facilitate rapid development efforts.
  • Experienced in UNIX work environment, file transfers SFTP, job scheduling and error handling.
  • Extensive functional and technical exposure. Experience working on high-visibility projects
  • Excellent analytical/ communication skills and a good team player.
  • An Oracle/Data Warehouse Architect/Data Architect/Data Modeler/Enterprise Data Architect & Oracle DBA with rich experience maintaining databases of different sizes.
  • Recently provided Data Architect, Database Re-Engineering, and Scalability/Performance contract/consulting services on IBM z/OS, z/VM/VSE, and pSeries/AIX platforms.
  • Skilled in logical and physical database design. Experienced in logical data modeling including business rules.
  • Worked on EDW/MDM, building data marts with Erwin and Erwin Model Mart in a multi-user environment for normalized and dimensional data structures.
  • Implemented and maintained web and distributed Enterprise applications using JavaScript, HTML, CSS, JSP, REST, JSON, JQuery, WCAG, and AJAX that follows W3C Web Standards
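
A minimal SQL sketch of the Type 2 (flag and timestamp) logic referenced in the SCD bullet above, assuming hypothetical CUSTOMER_DIM and CUSTOMER_STG tables; inside Power Center this logic is normally built with Lookup and Update Strategy transformations rather than hand-written SQL:

    -- Expire the current version of rows whose tracked attributes changed (assumed table/column names)
    UPDATE customer_dim d
       SET d.current_flag     = 'N',
           d.effective_end_dt = SYSDATE
     WHERE d.current_flag = 'Y'
       AND EXISTS (SELECT 1
                     FROM customer_stg s
                    WHERE s.customer_id = d.customer_id
                      AND (s.customer_name <> d.customer_name OR s.address <> d.address));

    -- Insert a new current version for new and changed customers
    INSERT INTO customer_dim
           (customer_key, customer_id, customer_name, address,
            current_flag, effective_start_dt, effective_end_dt)
    SELECT customer_dim_seq.NEXTVAL, s.customer_id, s.customer_name, s.address,
           'Y', SYSDATE, NULL
      FROM customer_stg s
     WHERE NOT EXISTS (SELECT 1
                         FROM customer_dim d
                        WHERE d.customer_id   = s.customer_id
                          AND d.current_flag  = 'Y'
                          AND d.customer_name = s.customer_name
                          AND d.address       = s.address);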

TECHNICAL EXPERIENCE:

ETL Tools: Informatica Power Center 9.x/8.x

Databases: Oracle, SQL Server and Teradata

Database Tools & Utilities: SQL*PLUS, SQL developer, SSMS, Teradata SQL Assistant, MLOAD, FLOAD, BTEQ

Other Tools: MS Office, PuTTY, Archiving/ILM

Operating systems: UNIX, Linux, Windows

Programming Skills: UNIX Shell scripting, SQL, SQL*PLUS, PL/SQL

Scheduling & ERP Tool: Maestro, Informatica Scheduler, Client - Citrix

PROFESSIONAL EXPERIENCE

Sr. Informatica Developer

Confidential, Orlando, FL

Responsibilities:

  • Analyzed the requirements and designed the document
  • Analyzed the current Python-based ETL application, which was written using BTEQ scripts, to understand the ETL transformations.
  • Based on the source-to-target mapping documents provided by business analysts and analysis of the current application, designed the ETL transformations to rewrite in Informatica
  • Integrated the data from various data sources such as Oracle, DB2 and flat files
  • Effectively worked on Mapping Designer, Workflow Manager, and Workflow Monitor.
  • Extensively used the Sequence Generator in all mappings; fixed bugs/tickets raised in production for existing mappings in the common folder for new files, using versioning (check in and check out), and provided urgent support to QA during component unit testing and validation.
  • Used shortcuts for sources, targets, transformations, mapplets, and sessions to reuse objects without creating multiple objects in the repository and inherit changes made to the source automatically.
  • Applied Slowly Changing Dimensions Type I and Type II based on business requirements.
  • Analyzed session error logs and used the Debugger to test mappings, then fixed bugs in DEV while following change procedures and validation
  • Fine-tuned ETL processes by considering mapping and session performance issues.
  • Responsible for Creating workflows and Worklets. Created Session, Event, Command, Control, Decision and Email tasks in Workflow Manager.
  • Extensively worked on performance tuning and on isolating the header and footer within a single file.
  • Worked with large amounts of data, independently executing data analysis with appropriate tools and techniques, interpreting results, and presenting them to both internal and external clients.
  • Used Informatica PowerExchange to replicate data from the source database to staging.
  • Wrote SQL queries to create end-user reports and developed SQL queries and stored procedures in support of ongoing work and application support (see the PL/SQL sketch after this list).
  • Designing and executing test scripts and test scenarios, reconciling data between multiple data sources and systems.
  • Involved in requirement gathering, Design, testing, project coordination and migration.
  • Project planning and scoping, facilitating meetings for project phases, deliverables, escalations and approval. Ensure adherence to SDLC and project plan.
  • Raised change requests, analyzed and coordinated resolution of program flaws, and fixed them in the DEV and Pre-Production environments and, during subsequent runs, in PROD.
  • Performed profiling analysis on existing data, identified root causes of data inaccuracies, and provided impact analysis and data quality recommendations.
  • Precisely documented mappings to ETL Technical Specification document for all stages for future reference.
  • Scheduled jobs for running daily, weekly and monthly loads through control-M for each workflow in a sequence with command and event tasks.
  • Created requirement specifications documents, user interface guides, and functional specification documents, ETL technical specifications document and test case.
  • Used most of the transformations such as Aggregator, Filter, Router, Sequence Generator, Update Strategy, Rank, Expression, Lookup (connected and unconnected), Mapping Parameters, Session Parameters, Mapping Variables and Session Variables.
  • Deployed the code in different environments like ST, SIT, UAT & Prod
  • Maintained the proper communication between other teams and client.
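
A minimal PL/SQL sketch of the kind of stored procedure developed for the end-user reports mentioned above, assuming hypothetical ORDERS and DAILY_ORDER_SUMMARY objects rather than the actual client tables:

    -- Hypothetical procedure that refreshes a one-day slice of a report summary table
    CREATE OR REPLACE PROCEDURE refresh_daily_order_summary (p_load_date IN DATE) AS
    BEGIN
      -- Make the load re-runnable: clear the day's slice before reloading it
      DELETE FROM daily_order_summary
       WHERE load_date = TRUNC(p_load_date);

      INSERT INTO daily_order_summary (load_date, region, order_count, order_amount)
      SELECT TRUNC(p_load_date), o.region, COUNT(*), SUM(o.order_amount)
        FROM orders o
       WHERE o.order_date >= TRUNC(p_load_date)
         AND o.order_date <  TRUNC(p_load_date) + 1
       GROUP BY o.region;

      COMMIT;
    END refresh_daily_order_summary;
    /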

Environment: SQL, PL/SQL, UNIX, Shell Scripting, HP Quality Center 10, Informatica Power Center 9.x, Control-M.

Sr. Informatica ETL Developer

Confidential, Minneapolis, MN

Responsibilities:

  • Developed complex mappings in Informatica to load the data from source tables using different transformations like Source Qualifier, Lookup (connected and unconnected), Expression, Aggregator, Update Strategy, Stored Procedure, Joiner, XML, Filter, Sorter and Router.
  • Installed Informatica 9.0.1 HotFix 2 environments and backed up the 7.8.4 contents.
  • Created a sandbox environment for regression/load testing.
  • Performed Informatica 7.8.4 repository backup and restore.
  • Unit test case creation along with test data creation for testing the code
  • Folder migration from old repository to upgraded repository.
  • OBIEE Reports creation and user prompts creation.
  • DAC Scheduler monitoring.
  • Performed an end-to-end warehouse upgrade from Siebel Analytics 7.8.4 to OBIA Apps 7.9.6.3
  • Work assignment to the offshore team and regular status monitoring.
  • Migrated data from obsolete tables to the upgraded tables using the vanilla UPG Informatica repository.
  • Configured the Informatica sequence generator .bat file during data migration
  • Unit test case preparation and data validation.
  • Retrofitted Informatica mappings in order to replace obsolete tables
  • Performed full data warehouse loads using the DAC scheduler, monitored the loads, and fixed issues as they arose.
  • SCD type mappings creation using different transformations.
  • Worked on data masking, hiding the original data through encryption and alias names at the object level (see the SQL sketch after this list)
  • Performed data analysis from the Siebel source through the warehouse to the BI front end.
  • Unit testing, surface testing, integration testing and sandbox testing.
  • Performed dry-run activities before go-live, along with support activities.
  • Developed Mapplets to implement business rules that involved complex logic.
  • Tuned the mappings and sessions for better performance by eliminating various performance bottlenecks.
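
A rough SQL sketch of the masking idea mentioned above, assuming a hypothetical CUSTOMER table and the availability of Oracle 12c's STANDARD_HASH; in the project the masking was applied through the ETL layer rather than a view like this:

    -- Hypothetical masked view that exposes aliases and hashes instead of the original values
    CREATE OR REPLACE VIEW customer_masked_v AS
    SELECT customer_id,
           'CUST_' || customer_id                      AS customer_alias,    -- alias name at the object level
           STANDARD_HASH(ssn, 'SHA256')                AS ssn_hash,          -- one-way hash instead of the clear value
           SUBSTR(card_number, 1, 4) || 'XXXXXXXXXXXX' AS card_number_mask   -- keep only the leading digits
      FROM customer;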

Environment: Informatica Power Center 9.0.1, SQL, PL/SQL, Toad, MS SQL 2008, DAC 10.1.3.4.1, OBIA Apps 7.9.6.3

Informatica Developer/SME

Confidential, Richmond, VA

Responsibilities:

  • Created complex mappings using transformations like SQLT, Update Strategy, Joiner, Lookup, and Sequence Generator, along with reusable sessions and mapplets.
  • Created mapping, session, workflow parameters and variables. Designed intraday process incremental load using mapping variable.
  • Implemented various Transformations: Joiner, sorter, Aggregate, Expression, Lookup, Filter, Update Strategy and Router.
  • Developed and Scheduled jobs for daily loads that invoked Informatica workflows & sessions associated with the mappings.
  • Modified the existing mappings for the updated logic and better performance.
  • Avoided duplicate issues through SQL overrides in Lookup and Joiner transformations (see the SQL sketch after this list).
  • Performed unit testing on the mappings based on the HLD, LLD, and developed code, and developed plans based on pending transaction strategies for credit cards.
  • Tested Informatica mappings and workflows, PL/SQL procedures and worked extensively with Informatica for data sourcing, data transformation and data loading.
  • Provided SME (Subject Matter Expert) support for the Confidential Enterprise Data Warehouse application.
  • Responded to all business queries and standardized application production support across all of Confidential's business applications.
  • Handled the enhancement and maintenance of the applications used by the banking clients.
  • Institutionalizing the processes that are continuously improved and optimized by process action teams.
  • Preparing response strategies for quick resolution of breakdowns.
  • Created test case scenarios, executed test cases and maintained defects in internal bug tracking systems.
  • Used error handling to capture error data in tables for handling nulls, analysis, and the error remediation process.
  • Involved in massive data cleansing and data profiling of the production data load.
  • Defect management, bug tracking and bug Reporting using Quality Center.
  • Performed system maintenance activities and maintained all application-related activities.
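
A minimal sketch of the duplicate-avoidance idea behind the SQL overrides mentioned above, assuming a hypothetical STG_TRANSACTIONS table; the real overrides lived inside the Lookup and Joiner transformations:

    -- Keep exactly one row per business key, preferring the most recent record
    SELECT account_id, txn_id, txn_amount, txn_date
      FROM (SELECT t.*,
                   ROW_NUMBER() OVER (PARTITION BY t.account_id, t.txn_id
                                      ORDER BY t.txn_date DESC) AS rn
              FROM stg_transactions t)
     WHERE rn = 1;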

Environment: SQL, PL/SQL, Teradata, UNIX, Shell Scripting, HP Quality Center 10, Informatica Power Center 9.x

ETL Informatica Developer

Confidential

Responsibilities:

  • Involved in requirement gathering, Design, testing, project coordination and migration.
  • Used most of the transformations such as Aggregator, Filter, Router, Sequence Generator, Update Strategy, Rank, Expression, Lookup (connected and unconnected), Mapping Parameters, Session Parameters, Mapping Variables and Session Variables.
  • Created mappings using transformations such as Source Qualifier, Aggregator, Expression, Lookup, Router, Filter and Update Strategy.
  • Implemented Slowly Changing Dimensions- Type I & II in different mappings as per the requirement.
  • Scheduled and Run Extraction and Load process and monitor workflows using workflow monitor.
  • Responsible for Creating workflows and Worklets. Created Session, Event, Command, Control, Decision and Email tasks in Workflow Manager.
  • Maintain the unit test cases and system testing on the mappings, sessions and finally observe the execution of workflows from the workflow monitor.
  • Created requirement specifications documents, user interface guides, and functional specification documents, ETL technical specifications document and test case.
  • Understood the business requirements, the source systems, and the different logical models, and gained knowledge of the extract, transformation, and loading specifications of the project.
  • Used ETL to standardize data from various sources and load it into the staging area, which was in Oracle, and from stage to the different data marts (see the SQL sketch after this list).
  • Extensively used ETL to load data from various sources like Oracle, flat files, and SQL Server into the target warehouse database on Oracle.
  • Error log design, data load strategy, unit and system testing, system migration and job schedules.
  • Fine-tuned ETL processes by considering mapping and session performance issues.
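
A minimal SQL sketch of the stage-to-mart standardization described above, assuming hypothetical STG_CUSTOMER and MART_CUSTOMER tables; the actual loads were built as Informatica mappings:

    -- Standardize staged values while loading the mart (assumed column names)
    INSERT INTO mart_customer (customer_id, customer_name, country_code, load_date)
    SELECT s.customer_id,
           INITCAP(TRIM(s.customer_name)),       -- trim stray spaces and normalize name casing
           UPPER(NVL(s.country_code, 'UNK')),    -- default missing codes and force upper case
           SYSDATE
      FROM stg_customer s;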

Environment: SQL, PL/SQL, Teradata, UNIX, Shell Scripting, HP Quality Center 10, Informatica Power Center 8.x
