
Informatica Developer Resume


Raleigh, NC

SUMMARY

  • Around 9 years of IT experience in Business Intelligence and Data Warehousing applications.
  • 7+ years of experience in ETL of data from various sources into data warehouses and data marts using Informatica Power Center 10.2/9.6/9.5.1/9.1/9.0.1/8.x.
  • Excellent working knowledge of dimensional modeling using multi-dimensional models (Star and Snowflake schemas).
  • Hands-on knowledge of RDBMS platforms and relational database architecture.
  • Experience working with Informatica PowerExchange, Informatica ILM, Informatica DVO, Informatica IDQ, Informatica Analyst, Informatica Metadata Manager, TDM.
  • 2+ years of experience using Business Objects XIR2/6.x/5.x
  • Database experience of Oracle 11g/10g/9i/8i/7.x, Teradata V12/V13, SQL Server, IBM DB2.
  • Thorough knowledge of Teradata architecture; extensively worked with BTEQ, MLOAD, FLOAD, and SQL Assistant to design and develop dataflow paths for loading, transforming, and maintaining the data warehouse.
  • Strong knowledge of all phases of the Software Development Life Cycle (SDLC) on platforms such as UNIX and Windows.
  • Experience as an Informatica Administrator on 8.x and 9.x and in Informatica production support.
  • Expertise in the design and development of mappings to handle all types of slowly changing dimensions (SCD Types 1, 2, and 3).
  • Experience in transforming data from various types of sources like flat files, Oracle, Teradata, SQL server and XML documents.
  • Developed TDM capabilities in projects using Informatica TDM, Informatica PC, and Informatica PWX.
  • Extensive experience in Performance tuning of sources, targets, mappings, sessions, and SQL queries.
  • Extensive Expertise with error handling and Batch processing.
  • Experience with Push down optimization, Partitioning for better performance.
  • Strong knowledge of the Software Development Life Cycle (SDLC) with industry-standard methodologies such as Waterfall, Agile, and Scrum, covering requirement analysis, design, development, testing, support, and implementation; provided end-user training and support.
  • Extensive experience with TOAD, SQL Developer, SQL Navigator, Teradata SQL Assistant, and SQL Server Management Studio to test, modify, and analyze data, create indexes, and compare data across schemas.
  • Extensive experience in Shell scripting.
  • Good experience in creating procedures, functions, triggers and cursors using PL/SQL.
  • Expertise in performing Unit Testing, Integration Testing and Regression Testing.
  • Mentored and assisted developers and business analysts.
  • Extensive work experience in the onsite-offshore model.
  • Excellent communication and interpersonal skills; a team player and self-starter able to work independently or as part of a team, with strong time management, analytical reasoning, and adaptability to new technologies and tools.
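The SCD Type 2 work above was implemented in Informatica mappings rather than hand-written code; purely as an illustration, the core expire-and-insert logic with surrogate keys can be sketched in Python (the table and column names here are hypothetical):

```python
from datetime import date

HIGH_DATE = date(9999, 12, 31)  # open-ended expiry marking the current row version

def apply_scd2(dim_rows, source_row, today, next_key):
    """Expire the current version of a changed dimension row and insert a
    new version with a fresh surrogate key (SCD Type 2 expire-and-insert)."""
    for row in dim_rows:
        if row["cust_id"] == source_row["cust_id"] and row["eff_end"] == HIGH_DATE:
            if row["address"] == source_row["address"]:
                return dim_rows, next_key      # attributes unchanged: no-op
            row["eff_end"] = today             # expire the old version
            break
    dim_rows.append({
        "cust_sk": next_key,                   # new surrogate key
        "cust_id": source_row["cust_id"],      # business (natural) key
        "address": source_row["address"],
        "eff_start": today,
        "eff_end": HIGH_DATE,
    })
    return dim_rows, next_key + 1
```

In PowerCenter the equivalent logic is typically built from a Lookup on the current dimension row, an Expression comparing attributes, and an Update Strategy issuing DD_UPDATE and DD_INSERT.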

TECHNICAL SKILLS

ETL Tools: Informatica Power Center 10.x/9.6/9.5.1/9.1/9.0.1, Informatica BDE, Informatica PowerExchange 9.x/8.x/7.x, TDM, ILM, IDQ, DVO, Analyst, Metadata Manager.

OLAP Tools: Business Objects XIR2/6.x/5.x (Supervisor, Designer, Full Client, Web Intelligence), Crystal Reports 7.0/8.0/9.0/10/XI/XIR2

Languages: SQL, PL/SQL, UNIX shell scripting, C++, Java, XML.

Operating Systems: Windows, UNIX, Linux

Databases: Oracle 11g/10g/9i/8i/7.x, Netezza 7.2, Teradata (BTEQ, FastLoad, MultiLoad, TPump, SQL Assistant, FastExport), SQL Server, IBM DB2.

Data Modeling: Erwin, Star-Schema Modeling, Snowflake Modeling, Fact & Dimension Tables, MS Visio

PROFESSIONAL EXPERIENCE

Informatica Developer

Confidential, Raleigh, NC

Responsibilities:

  • Worked with business analysts to identify appropriate sources for data warehouse and prepared the Business Release Documents, documented business rules, functional and technical designs, test cases, and user guides.
  • Collaborated with customer on building solutions in database, data management, and analytics using RDBMS technology.
  • Actively involved in the Design and development of the STAR schema data model.
  • Implemented slowly changing and rapidly changing dimension methodologies, created aggregate fact tables for the creation of ad-hoc reports.
  • Extensively worked on Connected & Unconnected Lookups, Router, Expressions, Source Qualifier, Aggregator, Filter, Sequence Generator, etc.
  • Worked extensively on data integration into Google Cloud using IICS.
  • Wrote queries against the Google Cloud relational database.
  • Worked on a TDM (Test Data Management) project, performing the data masking component.
  • Developed TDM capabilities in the project using Informatica TDM, Informatica PC, and Informatica PWX.
  • Created and maintained surrogate keys on the master tables to handle SCD type 2 changes effectively.
  • Mentored Informatica developers on project for development, implementation, performance tuning of mappings and code reviews.
  • Validated Redshift data using Aginity Workbench.
  • Used SQL tools like Aginity Workbench, AQT, TOAD to run SQL queries and validate the data in warehouse and mart.
  • Developed Informatica mappings/mapplets, sessions, Workflows for data loads and automated data loads using UNIX shell scripts.
  • Used various lookup caches like Static, Dynamic, Persistent and Non-Persistent in Lookup transformation.
  • Involved in debugging mappings, recovering sessions, and developing error-handling methods.
  • Successfully migrated objects to the production environment while providing both technical and functional support.
  • Used TPT utility to load the data into Teradata.
  • Used PowerExchange CDC (Change Data Capture) to pull data from Oracle sources.
  • Developed Email tasks to send notifications to production support and operations.
  • Used Session Parameters to increase the efficiency of the sessions in the Workflow Manager.
  • Created DataStage Server jobs using Transformer stages, containers, etc.
  • Worked on Pentaho Kettle for loading data into ODS from pipe delimited flat files.
  • Designed and developed UNIX Scripts to automate the tasks.
  • Worked with Teradata FLoad and MLoad utilities of Teradata.
  • Resolved memory related issues like DTM buffer size, cache size to optimize session runs.
  • Performed Loading operation of historical data using full load and incremental load into Enterprise Data Warehouse.
  • Worked on Migration of mappings from Data Stage to Informatica.
  • Extensively worked on Power Center 9.5.1 Designer client tools like Source Analyzer, Target Designer, Mapping Designer, Mapplet Designer and Transformation Developer.
  • Designed & developed Informatica mappings, mapplets, worklets and workflows to create load-ready data files for loading Oracle E-Business Suite (EBS).
  • Developed various T-SQL stored procedures, functions, and packages.
  • Developed database objects such as SSIS packages, tables, triggers, and indexes using T-SQL.
  • Performed T-SQL tuning and optimized long-running report queries in SQL Server 2012.
  • Provided architectural design of Netezza to the client.
  • Created the format of the unit test documents per Netezza Framework.
  • Did data validations in Informatica mappings and loaded the target validation tables.
  • Developed custom logging so users can see when a row is inserted into the custom logging table by every SSIS package that executes.
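The data validations noted above were built inside Informatica mappings; as a minimal Python sketch only, the kind of row-count and checksum comparison involved might look like this (all field names are hypothetical):

```python
def validate_load(source_rows, target_rows, key="order_id", measure="amount"):
    """Compare a source extract against the loaded target: row counts,
    a simple sum checksum on one measure, and keys missing from the target.
    Returns a result dict suitable for writing to a validation table."""
    src_keys = {r[key] for r in source_rows}
    tgt_keys = {r[key] for r in target_rows}
    result = {
        "src_count": len(source_rows),
        "tgt_count": len(target_rows),
        "src_sum": sum(r[measure] for r in source_rows),
        "tgt_sum": sum(r[measure] for r in target_rows),
        "missing_keys": sorted(src_keys - tgt_keys),  # in source but not loaded
    }
    result["passed"] = (result["src_count"] == result["tgt_count"]
                        and result["src_sum"] == result["tgt_sum"]
                        and not result["missing_keys"])
    return result
```

In practice the same checks ran as SQL against the warehouse and mart, with the outcome landed in target validation tables.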

Environment: Informatica Power Center 10.2, Informatica Data Quality 10.X, Salesforce, Google Cloud, SQL Server 2017, Oracle 12c, OBIEE, PL/SQL, T-SQL, Flat Files, RedHat, Netezza TwinFin 3/6/Skimmer.

Informatica Developer

Confidential

Responsibilities:

  • Worked closely with business analysts to understand and document business needs for decision support data.
  • Used the Update Strategy Transformation to update the Target Dimension tables.
  • Developed ETL procedures to transform the data in the intermediate tables according to the business rules and functionality requirements.
  • Involved in the development of Informatica mappings and tuned them for better performance.
  • Migrated data from SQL Server to Oracle Exadata.
  • Created the Oracle Exadata database, users, base tables, and views using a proper distribution key structure.
  • Worked on data integration from various sources into the ODS using ODI (Oracle Data Integrator).
  • Configured PowerExchange for SAP R/3.
  • Retrieved data from SAP R/3.
  • Used Informatica PowerConnect for Oracle Exadata to pull data from the Oracle Exadata data warehouse.
  • Calculated KPIs and worked with end users on OBIEE report changes.
  • Created the RPD for OBIEE.
  • Developed mapping parameters and variables to support connections to the Oracle Exadata target database and the Oracle OLTP source database.
  • Created Mapplets and used them in different Mappings.
  • Used designer debugger to test the data flow and fix the mappings. Tuned Informatica Mappings and Sessions for optimum performance.
  • Provided detailed technical, process, and support documentation, including daily process rollback procedures, detailed specifications, and thorough project documentation covering workflows and their dependencies.
  • Worked with mapping parameters and variables to load data from different sources to corresponding partition of the database table.
  • Worked extensively in PL/SQL to migrate the data from Oracle to Oracle Exadata database.
  • Performed T-SQL tuning and optimized long-running report queries in SQL Server 2012.
  • Optimized T-SQL queries and converted PL/SQL code to T-SQL.
  • Created Schema objects like Indexes, Views, and Sequences.
  • Retrieved data from different delimited flat files.
  • Extracted data from BRD files, flat files, and Oracle, and loaded it through Informatica.
  • Loaded data into salesforce using SOAP, REST API.
  • Worked with crontab for job scheduling.
  • Provided production support and issue resolution.
  • Involved in Unit testing of Mappings, Workflows and debugging mappings for failed sessions.
  • Created partitions, SQL overrides in Source Qualifier, and session partitions to improve performance.
  • Performed unit testing and system testing of the mappings developed and documented with various scenarios.
  • Wrote UNIX shell scripts for repository backups, job scheduling via crontab, etc.
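Mapping parameters and variables like those above are normally supplied to PowerCenter at run time through a parameter file; a small Python sketch that renders one is shown below (the folder, workflow, and parameter names are hypothetical):

```python
def build_param_file(folder, workflow, params):
    """Render an Informatica-style parameter file section:
    a [Folder.WF:workflow] heading followed by $$NAME=value lines."""
    lines = [f"[{folder}.WF:{workflow}]"]
    for name, value in params.items():
        lines.append(f"$${name}={value}")      # mapping parameter assignment
    return "\n".join(lines) + "\n"
```

The generated text would be written to a .par file and passed to the session, letting the same workflow point at different source and target connections per environment.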

Environment: Informatica Power Center 10.2, Informatica PowerExchange 10.X, Informatica Data Quality 10.X, SAP R/3, ODI (Oracle Data Integrator), Salesforce, SQL Server 2017, AWS Cloud, AWS Redshift, Oracle 12c, Oracle Exadata, OBIEE, PL/SQL, Flat Files, Linux, PuTTY, WinSCP.

Informatica Developer

Confidential, Tucson, AZ

Responsibilities:

  • Worked with business analysts to identify appropriate sources for data warehouse and prepared the Business Release Documents, documented business rules, functional and technical designs, test cases, and user guides.
  • Worked on Jasper 6.1 on analytics and reporting.
  • Configured and changed the Jasper dashboard to meet different reporting requirements.
  • Worked on populating the reporting layer for Jasper 6.1.
  • Collaborated with customer on building solutions in database, data management, and analytics using RDBMS technology.
  • Actively involved in the Design and development of the STAR schema data model.
  • Implemented slowly changing and rapidly changing dimension methodologies, created aggregate fact tables for the creation of ad-hoc reports.
  • Extensively worked on Connected & Unconnected Lookups, Router, Expressions, Source Qualifier, Aggregator, Filter, Sequence Generator, etc.
  • Worked on a TDM (Test Data Management) project, performing the data masking component.
  • Developed TDM capabilities in the project using Informatica TDM, Informatica PC, and Informatica PWX.
  • Created and maintained surrogate keys on the master tables to handle SCD type 2 changes effectively.
  • Mentored Informatica developers on project for development, implementation, performance tuning of mappings and code reviews.
  • Used SQL tools like TOAD to run SQL queries and validate the data in warehouse and mart.
  • Developed Informatica mappings/mapplets, sessions, Workflows for data loads and automated data loads using UNIX shell scripts.
  • Used various lookup caches like Static, Dynamic, Persistent and Non-Persistent in Lookup transformation.
  • Involved in debugging mappings, recovering sessions and developing error-handling methods.
  • Successfully migrated objects to the production environment while providing both technical and functional support.
  • Used TPT utility to load the data into Teradata.
  • Used PowerExchange CDC (Change Data Capture) to pull data from Oracle sources.
  • Developed Email tasks to send notifications to production support and operations.
  • Used Session Parameters to increase the efficiency of the sessions in the Workflow Manager.
  • Created DataStage Server jobs using Transformer stages, containers, etc.
  • Worked on Pentaho Kettle for loading data into ODS from pipe delimited flat files.
  • Designed and developed UNIX Scripts to automate the tasks.
  • Worked with Teradata FLoad and MLoad utilities of Teradata.
  • Resolved memory related issues like DTM buffer size, cache size to optimize session runs.
  • Worked on SQL Server Integration Services (SSIS) to integrate and analyze data from multiple homogeneous and heterogeneous information sources (CSV, Excel, Oracle db).
  • Performed Loading operation of historical data using full load and incremental load into Enterprise Data Warehouse.
  • Created SSIS packages to extract, transform, and load data using transformations such as Lookup, Derived Column, Conditional Split, Aggregate, Pivot, Slowly Changing Dimension, Merge Join, and Union All.
  • Worked on Migration of mappings from Data Stage to Informatica.
  • Extensively worked on Power Center 9.5.1 Designer client tools like Source Analyzer, Target Designer, Mapping Designer, Mapplet Designer and Transformation Developer.
  • Designed & developed Informatica mappings, mapplets, worklets and workflows to create load-ready data files for loading Oracle E-Business Suite (EBS).
  • Did data validations in Informatica mappings and loaded the target validation tables.
  • Developed custom logging so users can see when a row is inserted into the custom logging table by every SSIS package that executes.

Environment: Informatica Power Center 10.1, Teradata 16, SQL Assistant for Teradata, Teradata Studio, Informatica IDQ 10.X, Informatica Cloud, Oracle 12c, SQL Server 2012, Jasper 6.1, Erwin 4.0, TOAD 9.x, Oracle EBS, Shell Scripting, Korn shell, Business Objects 5.1/6.5, PL/SQL, SSIS, Sun Solaris UNIX, Windows XP.
