
Sr. Informatica Developer Resume


Bloomington, IL

SUMMARY

  • Around 8 years of experience in data warehousing development, maintenance, enhancement, and production support projects as an Informatica developer, support engineer, and analyst.
  • Extensively used ETL methodology for data extraction, transformation, and loading in a corporate-wide ETL solution using Informatica PowerCenter 9.x/8.x, UNIX, Oracle 11g/10g/9i, DB2, Teradata, Netezza, and SQL Server.
  • Strong data warehousing ETL experience using Informatica PowerCenter 9.6.1/9.5.1/9.0.1/8.6.1/8.1.1 client tools (Mapping Designer, Repository Manager, Workflow Manager/Monitor) and server tools (Informatica Server, Repository Server Manager).
  • Experience creating mappings, mapplets, data profiling, scorecards, classifier models, probabilistic models, Human Tasks, exception record management, and data masking, and using IDQ-specific transformations such as Key Generator, Labeler, and Standardizer.
  • Strong experience in Informatica Data Quality (IDQ), PowerCenter, data cleansing, data profiling, data quality measurement, and data validation processing.
  • Prominent experience in data integration: extracting, transforming, and loading data from multiple heterogeneous source systems such as Oracle, Sybase, SQL Server, DB2, Teradata, XML, and flat files (fixed width, delimited) into the data warehouse using Informatica.
  • Successfully Integrated Multiple XML sources and created a de-normalized, flat-structured file.
  • Expertise working on Slowly Changing Dimensions Type 1 and Type 2.
  • Experience in debugging mappings. Identified bugs in existing mappings by analyzing the data flow and evaluating transformations using Informatica debugger.
  • Hands-on experience in UNIX (Korn and Bash) shell scripting.
  • Experience in designing and implementing partitioning to improve performance while loading large data volume.
  • Working knowledge of reporting tools like Microsoft SQL Server Reporting Services (SSRS), OBIEE, and Cognos BI.
  • Expertise in Change Data Capture (CDC) using Informatica Power Exchange.
  • Expertise in implementing Slowly Changing Dimensions (SCD) Type1, Type2 and Type3.
  • Strong understanding of RDBMS, OLAP, OLTP concepts and experience in writing complex PL/SQL and SQL statements in databases.
  • Strong understanding of BI Software development life cycle (SDLC) methodologies including Waterfall and Agile.
  • Profound knowledge of the Teradata database architecture. Worked with Teradata loading and unloading utilities such as FastExport, FastLoad, and MultiLoad (MLoad).
  • Experience in implementing MDM software solutions with Informatica MDM (formerly Siperian).
  • Strong exposure to working on scheduling tools like AutoSys and Control-M.
  • Experience in creation of UNIT test plans, SIT scripts, UAT scripts for testing.
  • Well versed with HIPAA, Facets, claim adjustments, claim processing from point of entry to finalization, claim review, identifying claims-processing problems and their sources, and providing corresponding solutions.
  • Experience in developing web services using SOAP for MDM operations.
  • Complete knowledge of data warehouse methodologies (Ralph Kimball, Inmon), ODS, EDW, and metadata repositories.
  • Strong experience in dimensional modeling using Star and Snowflake schemas, identifying facts and dimensions, and physical and logical data modeling using ERwin and ER/Studio.
  • Hands on experience in Informatica MDM Hub Match and Merge Rules, Batch Jobs and Batch Groups.
  • Good knowledge in development, implementation, administration and support of ETL processes for large-scale Data Warehouses using Data Transformation Services (DTS) and SSIS with MS SQL 2008/2005/2000.
  • Onsite Coordinator for entire Data Migration team, responsible for requirements gathering, preparing mapping specifications, analysis on requirement gaps and data model.
  • Participated in the daily scrum meetings for reporting updates on onsite work as well as the offshore development team.
  • Excellent problem-solving and trouble-shooting capabilities. Quick learner, highly motivated, result oriented and an enthusiastic team player.
  • Good interpersonal skills, experience in handling communication and interactions between different teams.
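
As a sketch of the Slowly Changing Dimension Type 2 pattern referenced above, the minimal Python example below (hypothetical table shape and field names, not project code) expires the current dimension row and inserts a new version when a tracked attribute changes:

```python
from datetime import date

def apply_scd2(dim_rows, incoming, key="cust_id", today=None):
    """Minimal SCD Type 2 sketch: expire the current row and insert a
    new version when a tracked attribute changes. dim_rows is a list of
    dicts with the key, 'city' (the tracked attribute, hypothetical),
    'eff_from', 'eff_to', and a 'current' flag."""
    today = today or date.today()
    HIGH_DATE = date(9999, 12, 31)  # conventional open-ended end date
    for row in dim_rows:
        if row[key] == incoming[key] and row["current"]:
            if row["city"] != incoming["city"]:  # tracked attribute changed
                row["current"] = False           # expire the old version
                row["eff_to"] = today
                dim_rows.append({key: incoming[key], "city": incoming["city"],
                                 "eff_from": today, "eff_to": HIGH_DATE,
                                 "current": True})
            return dim_rows
    # unseen key: insert the first version
    dim_rows.append({key: incoming[key], "city": incoming["city"],
                     "eff_from": today, "eff_to": HIGH_DATE, "current": True})
    return dim_rows
```

In PowerCenter this logic would live in an Update Strategy transformation plus a Lookup on the current dimension row; the Python above only illustrates the row-versioning rule.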

TECHNICAL SKILLS

ETL Tools: Informatica 9.x/8.x, Informatica MDM, IDQ, Power Exchange.

OS: Sun Solaris 2.x/7/8, HP-UX, IBM AIX, UNIX, Linux, MS-DOS, Windows 2000/2003/NT/XP/Vista/7.

Databases: Oracle 8.0/8i/9i/10g/11g, SQL Server 2008/2005, Sybase 11.5, XML, DB2, Netezza, Teradata

Programming Skills: C, C++, UNIX (Korn and Bash) shell scripting, PL/SQL, Java, J2EE, HTML, JCL

Db Tools: SQL*PLUS, SQL*LOADER, Quest TOAD, SQL Navigator, SQL developer, SQL server management studio, SQL Assistant

DM Tools: Erwin, ER Studio Data Architect, MS Visio

BI Tools: IBM Cognos, Business Objects, OBIEE

PROFESSIONAL EXPERIENCE

Confidential, Bloomington, IL

Sr. Informatica Developer

Responsibilities:

  • Actively involved in meetings with the data modeler and business users to understand the existing environment and gather the requirements to create the data mart.
  • Designed ER diagrams, the logical model (relationships, cardinality, attributes, and candidate keys), and the physical database (capacity planning, object creation, and aggregation strategies) for Oracle per business requirements using Erwin.
  • Involved in design and development of parallel jobs, sequences using the Designer.
  • Designed several parallel jobs using Sequential File, Dataset, Join, Merge, Lookup, Change Apply, Change Capture, Remove duplicates, Funnel, Filter, Copy, Column Generator, Peek, Modify, Compare, Oracle Enterprise, Surrogate Key, Aggregator, Transformer, Row Generator stages.
  • Worked with the SCD stage to implement slowly changing dimensions. Extracted data from flat files and Oracle databases and applied business logic to load it into the central Oracle database.
  • Provided advanced business-rule development using IDQ Developer.
  • Developed mappings, reusable objects, transformations, and mapplets using Mapping Designer, Transformation Developer, and Mapplet Designer in Informatica PowerCenter 9.6.1.
  • Exported IDQ mappings into Informatica Designer, created tasks in Workflow Manager, and scheduled the mappings with scheduling tools.
  • Created reusable transformations and mapplets and used them in mappings.
  • Involved in creating the mapping for change data capture using MD5 function.
  • Created workflows, tasks, database connections, FTP connections using workflow manager.
  • Used Informatica PowerCenter 9.6.1 for extraction, transformation, and loading (ETL) of data into the data warehouse.
  • Configured, Developed and implemented MDM end to end solution.
  • Worked with data modelers to prepare logical and physical data models and added and deleted necessary fields using Erwin.
  • Responsible for implementing the Informatica CDC logic.
  • Tuned DataStage jobs for better performance by creating DataStage Lookup files for staging the data and lookups.
  • Designed and developed client centric Master Data Management (MDM) solution.
  • Owning all the code migrations/reviews includes Informatica, UNIX shell scripts, and DB objects.
  • Leveraged Informatica transformations such as Web Service and Java transformations to handle complex XML and string-parsing scenarios.
  • Used the Manager to import, export jobs and routines.
  • Created Shared Containers to increase Object Code Reusability and to increase throughput of the system.
  • Experience developing complex transformations, surrogate keys, routines, dimension tables and fact tables.
  • Used Environment Variables, Stage Variables and Routines for developing Parameter Driven Jobs and debugging them.
  • Performed match/merge and ran match rules to check the effectiveness of MDM process on data.
  • Designed the job templates and created new jobs from existing templates.
  • Attended production implementation calls and coordinated with the DBA’s during migration of the code.
  • Enhanced the Job Performance by using proper Partitioning methods and analyzing the resources utilized using Job Monitor.
  • Created Master Job Sequencers to control sequence of Jobs using job controls.
  • Used Data Stage Director for running and monitoring performance statistics.
  • Created Test Plans for Unit and Integration Testing for designed jobs. Assisted in UAT Testing and provided necessary reports to the business users.
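
The change-data-capture approach noted above (an MD5 hash over the tracked columns, compared against the hash stored with the target row) can be sketched in Python as follows; the column and key names are placeholders, not the project's actual schema:

```python
import hashlib

def row_md5(row, columns):
    """Concatenate the tracked columns with a delimiter and hash them,
    mirroring an MD5(...) expression used for change detection."""
    joined = "|".join(str(row.get(c, "")) for c in columns)
    return hashlib.md5(joined.encode("utf-8")).hexdigest()

def detect_changes(existing_hashes, incoming_rows, key, columns):
    """Classify incoming rows as inserts or updates by comparing hashes.
    existing_hashes: {key_value: md5_of_current_row} from the target."""
    inserts, updates = [], []
    for row in incoming_rows:
        h = row_md5(row, columns)
        old = existing_hashes.get(row[key])
        if old is None:
            inserts.append(row)       # key unseen -> insert
        elif old != h:
            updates.append(row)       # hash differs -> update
        # equal hash -> row unchanged, skip
    return inserts, updates
```

The same comparison is typically done in an Expression transformation feeding a Router; unchanged rows are filtered out before the target load.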

Environment: Informatica PowerCenter 9.6.1 (Repository Manager, Designer, Workflow Monitor, Workflow Manager), IBM InfoSphere DataStage 8.1/8.0, AutoSys, Erwin, Oracle 11g/10g, Siebel, Informatica IDQ, MDM, EIM, StarTeam, JCL, Teradata, Netezza, TOAD, Java, SQL*Plus, Windows, UNIX shell scripting, PuTTY.

Confidential, Chicago, IL

ETL/Informatica Developer

Responsibilities:

  • Participated in user meetings, gathered Business requirements & specifications for the Data-warehouse design. Translated the user inputs into ETL design docs.
  • Developed data model in MDM using schema viewer.
  • Involved in creating Informatica mappings to extract data from Oracle and flat files and load it into the stage area. Worked on code/bug fixes of existing PL/SQL procedures and packages.
  • Developed mappings based on use cases involving authenticated web services (both REST and SOAP).
  • Worked on data mapping, data cleansing, program development for loads, and verification of converted data against legacy data.
  • Involved in error handling, performance tuning of mappings, testing of Stored Procedures and Functions, Testing of Informatica Sessions, and the Target Data.
  • Possess expertise with relational database concepts, stored procedures, functions, triggers and scalability analysis.
  • Optimized the SQL and PL/SQL queries by using different tuning techniques like using hints, parallel processing, and optimization rules.
  • Worked on data cleansing and standardization using the cleanse functions in Informatica MDM.
  • Tested and debugged all ETL objects to evaluate performance and verify that the code met business requirements.
  • Responsible for migrations of the code from Development environment to QA and QA to Production.
  • Worked with the Informatica Data Quality (IDQ) toolkit: analysis, data cleansing, data matching, data conversion, address standardization, exception handling, and the reporting and monitoring capabilities of IDQ.
  • Implemented MDM in testing the data quality services and good working knowledge on MDM.
  • Identified and eliminated duplicates in datasets through IDQ components such as Edit Distance, Jaro Distance, and Mixed Field Matcher. This enables a single view of customers and helps control mailing-list costs by preventing duplicate mailings.
  • Expertise in implementing Profiling, Score Card, Classifier models, probabilistic models, Human task, Exception record management as part of IDQ process.
  • Involved in generating reports using Cognos.
  • Exported IDQ mappings into Informatica Designer, created tasks in Workflow Manager, and scheduled them with scheduling tools.
  • Carried out defect analysis and fixed bugs raised by users; involved in support and troubleshooting of production systems as required, optimizing performance, resolving production problems, and providing timely follow-up on problem reports.
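
The edit-distance matching used above for duplicate elimination can be illustrated with a classic Levenshtein implementation; the threshold and sample names below are illustrative only:

```python
def edit_distance(a, b):
    """Levenshtein distance via dynamic programming: minimum number of
    single-character inserts, deletes, and substitutions to turn a into b."""
    m, n = len(a), len(b)
    prev = list(range(n + 1))
    for i in range(1, m + 1):
        curr = [i] + [0] * n
        for j in range(1, n + 1):
            cost = 0 if a[i - 1] == b[j - 1] else 1
            curr[j] = min(prev[j] + 1,        # delete from a
                          curr[j - 1] + 1,    # insert into a
                          prev[j - 1] + cost) # substitute (or match)
        prev = curr
    return prev[n]

def likely_duplicates(names, threshold=2):
    """Pair up names whose (case-insensitive) edit distance is within
    the threshold -- a crude stand-in for IDQ's match components."""
    pairs = []
    for i in range(len(names)):
        for j in range(i + 1, len(names)):
            if edit_distance(names[i].lower(), names[j].lower()) <= threshold:
                pairs.append((names[i], names[j]))
    return pairs
```

In practice IDQ combines several similarity measures and field weights before the Match/Merge step; a single edit-distance cutoff is only the simplest case.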

Environment: Informatica PowerCenter 9.1.0/9.5.1 (Repository Manager, Designer, Workflow Monitor, Workflow Manager), IBM InfoSphere DataStage 8.1/8.0, Cognos, TOAD, Erwin, IDQ, AutoSys, Siebel, Oracle 11g/10g, Teradata, StarTeam, SQL*Plus, Windows, UNIX, shell scripting.

Confidential, Fort Worth, TX

ETL/Informatica Developer

Responsibilities:

  • Interacted with Business Analysts to finalize the requirements and documented the technical design document for Coding.
  • Developed DataStage parallel jobs using the required stages; data obtained from different sources was formatted, cleansed, summarized, aggregated, transformed, and loaded into the data warehouse.
  • Involved in design and development of parallel jobs, sequences using the Designer.
  • Designed several parallel jobs using Sequential File, Dataset, Join, Merge, Lookup, Change Apply, Change Capture, Remove duplicates, Funnel, Filter, Copy, Column Generator, Peek, Modify, Compare, Oracle Enterprise, Surrogate Key, Aggregator, Transformer, Row Generator stages.
  • Developed Metadata driven code for effective utilization and maintenance using technical metadata, business metadata and process metadata.
  • Experience in creating base objects in MDM.
  • Performed performance tuning by identifying bottlenecks in Informatica mappings and sessions, and by using explain plans in Oracle via TOAD.
  • Utilized Informatica IDQ.
  • Served as data analyst and developer for data profiling, matching and removing duplicate data, fixing bad data, and fixing NULL values.
  • Extensively worked with all the new features in IBM Information Server.
  • Developed wrapper web services for managing data transfer from the SQL database of internal systems to third-party interface web services.
  • Worked with the SCD stage to implement slowly changing dimensions. Extracted data from flat files and Oracle databases and applied business logic to load it into the central Oracle database.
  • Worked on fulfilling/responding to business requirements from clients, preparing for business workshops, project kickoffs, data warehouse assessments, Informatica and MDM initiatives, data profiling and data quality workshops using Informatica Data Explorer and Informatica Data Quality.
  • Upgraded Informatica from version 8.6.1 to 9.0.1.
  • Efficiently implemented Change Data Capture (CDC) to extract information from numerous SQL Server Tables.
  • Designed and developed Business Intelligence solutions using the Microsoft data tools (SSRS, SSAS, SSIS).
  • Created reusable transformations and mapplets and used them in mappings.
  • Worked with data modelers to prepare logical and physical data models and added and deleted necessary fields using Erwin.
  • Developed scripts in BTEQ to import and export the data. Generated status reports using Workflow Manager.
  • Used UNIX Shell Script for data extraction, running Pre-Session, Post-Session and PL/SQL procedures.
  • Used debugger to test the data flow and fixed the mappings.
  • Designed the job templates and created new jobs from existing templates.
  • Enhanced the Job Performance by using proper Partitioning methods and analyzing the resources utilized using Job Monitor.
  • Used Data Stage Director for running and monitoring performance statistics.
  • Worked in the onsite/offshore model, holding daily reviews with offshore team leads to track progress against deadlines.
  • Created Test Plans for Unit Testing for designed jobs.
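
Partitioning for parallel loads, mentioned above as a performance technique, amounts to splitting the source rows into disjoint slices by a key hash so each partition can be loaded concurrently. A minimal sketch (key name and partition count are illustrative):

```python
import hashlib

def hash_partition(rows, key, num_partitions):
    """Distribute rows across partitions by hashing the key column, so
    each parallel loader works on a disjoint, roughly even slice and
    all rows for a given key always land in the same partition."""
    partitions = [[] for _ in range(num_partitions)]
    for row in rows:
        digest = hashlib.md5(str(row[key]).encode("utf-8")).hexdigest()
        idx = int(digest, 16) % num_partitions
        partitions[idx].append(row)
    return partitions
```

PowerCenter offers hash, key-range, round-robin, and pass-through partitioning at the session level; hash partitioning is shown here because it keeps all versions of a key together, which matters for aggregations and SCD updates.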

Environment: Informatica PowerCenter 9.0.1/8.6.1, Power Exchange 9.0.1/8.6.1, MDM, Oracle 10g, PL/SQL, UNIX, TOAD, flat files, Control-M, Quality Center.

Confidential

ETL Developer

Responsibilities:

  • Used transformations such as source qualifier, aggregator, expression, lookup, router, filter, update strategy, joiner, transaction control and stored procedure.
  • Developed and tuned Informatica mappings and Mapplets for optimum performance, dependencies and batch design.
  • Worked with Informatica source analyzer, warehouse designer, transformation developer, Mapplets Designer, Mapping Designer, repository manager, workflow manager, workflow monitor, repository server and Informatica server to load data from Flat Files, legacy data.
  • Designed mappings between sources (external files and databases) to operational staging targets.
  • Worked with pre and post sessions, and extracted data from transaction system into staging area.
  • Identified facts and dimensions tables.
  • Tuned sources, targets, mappings and sessions to improve the performance of data load.
  • Wrote Teradata SQL queries according to process need.
  • Used Teradata SQL Assistant and Teradata Manager for database work.
  • Generated reports using Teradata BTEQ.
  • Wrote BTEQ scripts to extract data.
  • Consumed web service to retrieve user credentials from LDAP server.
  • Designed dynamic SSIS packages to transfer data across different platforms, validated and cleansed data during transfer, and archived data files for different DBMSs.
  • Deployed SSIS packages and scheduled jobs.
  • Established loading process using MLOAD, FLOAD, and BTEQ.
  • Designed mapping templates to specify high level approach.
  • Experience in working with user exits using Java.
  • Worked with the reporting team to generate the SSRS reports from the loaded data.
  • Performed unit testing and documentation.
  • Provided production support for business users and documented problems and solutions to run the work flow.
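
The BTEQ export scripts mentioned above follow a fixed skeleton (logon, export, query, reset, logoff), so they are often generated from templates. A sketch of such a generator; the table name, file path, and logon string are placeholders, not the original project's values:

```python
def build_bteq_export(table, out_file, logon="tdprod/etl_user,${TD_PWD}"):
    """Assemble a minimal BTEQ export script as text. The logon uses a
    shell-style password variable placeholder rather than a literal."""
    lines = [
        f".LOGON {logon};",
        f".EXPORT REPORT FILE = {out_file};",
        f"SELECT * FROM {table};",
        ".EXPORT RESET;",
        ".LOGOFF;",
        ".QUIT;",
    ]
    return "\n".join(lines)
```

The generated text would be written to a file and run as `bteq < script.bteq`; real extract scripts also set width/format options and use explicit column lists instead of `SELECT *`.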

Environment: Informatica 8.1.1/8.6.1, Teradata V2R5/V2R6, Oracle 10g, Mercury Quality Center 9.0, SSIS, SSRS, Java, Perl wrapper scripts, SQL, XML, Web Services, UNIX (HP-UX)

Confidential

Data Warehouse Developer

Responsibilities:

  • Involved in design & development of operational data source and data marts in Oracle
  • Reviewed source data and recommend data acquisition and transformation strategy
  • Involved in conceptual, logical and physical data modeling and used star schema in designing the data warehouse
  • Designed ETL process using Informatica Designer to load the data from various source databases and flat files to target data warehouse in Oracle
  • Used PowerCenter Workflow Manager to design sessions and used Event Wait/Raise, Assignment, E-mail, and Command tasks to execute mappings.
  • Created parameter based mappings, Router and lookup transformations
  • Created mapplets to reuse the transformation in several mappings
  • Used PowerCenter Workflow Monitor to monitor the workflows.
  • Worked on Complex SQL- queries and tuned them to improve performance
  • Optimized mappings using transformation features like Aggregator, filter, Joiner, Expression, Lookups
  • Worked extensively on Oracle database 9i and flat files as data sources.
  • Performed integrated testing for various mappings. Tested the data and data integrity among various sources and targets.
  • Created daily and weekly workflows and scheduled to run based on business needs
  • Created Daily/weekly ETL process which maintained 100GB of data in target database
  • Created complex stored procedures/functions/triggers/query tuning in MS Access 97

Environment: Oracle 9i, Informatica Power Center 8.1.1, Windows NT, UNIX, SQL Server, PL/SQL
