Informatica Consultant Resume

SUMMARY

  • 8 years of IT experience in the analysis, design, development, testing, and implementation of business application systems for the Automobile, Health Care, Transportation & Logistics, Media & Entertainment, and Banking & Financial sectors.
  • Strong experience implementing solutions using Informatica products such as Informatica Intelligent Cloud Services (IICS), Informatica Power Center 10.2/9.6/8.6.1, Informatica Data Quality (IDQ) 10.2, Informatica Analyst, Informatica Power Exchange, and Informatica MDM.
  • Experience in data extraction and in column profiling, rule profiling, midstream profiling, join analysis profiling, data domains, domain discovery, scorecards, and data lineage, as well as data cleansing, data standardization, match & merge processing, and data de-duplication using Informatica Data Quality 10.2 HF2.
  • Experienced in building Data Integration/Application Integration jobs in IICS using connections such as Salesforce, Oracle, Amazon S3, Amazon Redshift, and the Snowflake cloud data warehouse.
  • Experience implementing Pushdown Optimization to improve load performance on the Snowflake cloud data warehouse.
  • Experience with the Application Integration, Data Integration, Data Synchronization, Data Replication, Mass Ingestion, Administrator, and Monitor services in IICS.
  • Strong experience using the Informatica Data Quality (IDQ) toolkit: analysis, data cleansing, data matching, data conversion, exception handling, and the reporting and monitoring capabilities of IDQ.
  • Experience using Informatica Address Doctor for global address verification across an organization and integrating IDQ components into Informatica Power Center.
  • Good experience with rule specifications, mapping specifications, virtual data objects, mapplets, rules, and reference data management.
  • Experience deploying mapplets as services and using them in downstream processes.
  • Extensively worked with complex mappings using transformations such as Expression, Lookup, Filter, Router, Union, Aggregator, Joiner, Update Strategy, Sequence Generator, Java, Labeler, Match, Merge, Decision, Exception, Key Generator, Standardizer, Case Converter, Consolidation, Parser, and Address Validator, along with reusable transformations and user-defined functions.
  • Experience in Master Data Management (MDM) Multidomain Edition Hub Console configurations and Informatica Data Director (IDD) application creation.
  • Extensive experience in Master Data Management (MDM) Hub Console configurations such as staging process configuration, landing process configuration, match and merge processes, cleansing functions, and user exits.
  • Experience in creating queries, packages, custom cleansing functions and reusable functions.
  • Experience in dimensional modeling: creating data marts with Star and Snowflake schemas, identifying facts and dimensions (SCD Type I and SCD Type II), and physical and logical data modeling using Erwin and ER/Studio (a SQL sketch of the SCD Type II pattern follows this list).
  • Strong experience with the Teradata utilities BTEQ, TPump, FastLoad, MultiLoad, TPT, and macros to load and export data to/from flat files.
  • Good experience implementing CDC (Change Data Capture)/delta extraction from source systems by identifying the critical date fields and capturing all the metadata in ETL control tables (see the second sketch after this list).
  • Extensive experience developing stored procedures, functions, views, triggers, and complex SQL queries in T-SQL and Oracle PL/SQL against Oracle, SQL Server, Netezza, and Teradata.
  • Experience ingesting data into data lake Amazon S3 buckets and working with the Amazon Redshift and Snowflake cloud data warehouses for analytical needs.
  • Skilled in UNIX shell scripting, with experience on different UNIX platforms.
  • Strong experience developing test plans and test scripts to perform data validation; involved in system and integration testing.
  • Worked with different scheduling tools like Control-M, Zeke, Redwood, Autosys.
  • Good working experience in Agile and Waterfall methodologies.
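
The SCD Type II handling mentioned above was built as Informatica mappings with Update Strategy transformations; what follows is a minimal SQL sketch of the equivalent expire-and-insert pattern, assuming a hypothetical CUSTOMER_DIM dimension, CUSTOMER_STG staging table, and CUSTOMER_SEQ surrogate-key sequence:

    -- Expire the current version of any customer whose tracked attribute
    -- changed (all table and column names here are hypothetical).
    UPDATE customer_dim d
       SET eff_end_dt   = CURRENT_DATE - 1,
           current_flag = 'N'
     WHERE d.current_flag = 'Y'
       AND EXISTS (SELECT 1
                     FROM customer_stg s
                    WHERE s.customer_id = d.customer_id
                      AND s.address    <> d.address);

    -- Insert a fresh current version for new customers and for the rows
    -- just expired above (neither has an open 'Y' row any longer).
    INSERT INTO customer_dim
        (customer_key, customer_id, address, eff_start_dt, eff_end_dt, current_flag)
    SELECT customer_seq.NEXTVAL, s.customer_id, s.address,
           CURRENT_DATE, DATE '9999-12-31', 'Y'
      FROM customer_stg s
     WHERE NOT EXISTS (SELECT 1
                         FROM customer_dim d
                        WHERE d.customer_id  = s.customer_id
                          AND d.current_flag = 'Y');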
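Likewise, a sketch of the control-table-driven delta extraction noted above, assuming a hypothetical ETL_CONTROL table; in the actual jobs the watermark was supplied to the source qualifier through mapping parameters:

    -- 1. Read the high-water mark recorded by the previous successful run
    --    (ETL_CONTROL and its columns are hypothetical).
    SELECT last_extract_ts
      FROM etl_control
     WHERE source_table = 'ORDERS';

    -- 2. Extract only rows changed since that watermark, using the
    --    critical date field identified on the source (:last_extract_ts
    --    is bound from step 1 by the workflow).
    SELECT *
      FROM orders
     WHERE last_update_ts > :last_extract_ts;

    -- 3. After a clean load, advance the watermark for the next cycle.
    UPDATE etl_control
       SET last_extract_ts = :batch_start_ts,
           last_run_status = 'SUCCESS'
     WHERE source_table = 'ORDERS';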

TECHNICAL SKILLS

ETL Tools: Informatica Power Center 10.2/9.6/8.6, Informatica Data Quality (IDQ), Informatica Power Exchange (PWX), Informatica Intelligent Cloud Services (IICS), SSIS (SQL Server Integration Services), DataStage, Oracle GoldenGate CDC, SQL Server Data Tools, Visual Studio.

Reporting Tools: Solver BI360, Cognos, MicroStrategy

Data Modeling Tools: Dimensional data modeling, Star/Snowflake schema modeling, Erwin 8/7, Microsoft Visio.

Databases: Oracle 10g/9i/8i, Netezza, DB2, Salesforce, Teradata (FastLoad, MultiLoad, BTEQ), MS SQL Server, MS Access.

Languages: SQL, PL/SQL, T-SQL, UNIX Shell Scripting

Cloud Technologies: Amazon Redshift, Amazon S3, Snowflake cloud data warehouse

Operating Systems: UNIX, Windows.

Scheduling Tools: Informatica Scheduler, Autosys, RedWood, Control-M

Others: Rally, HP Quality Center, Remedy User, Jira.

PROFESSIONAL EXPERIENCE

Confidential

Informatica Consultant

Responsibilities:

  • Involved in gathering business requirements and attended technical review meetings to understand the data warehouse model.
  • Developed Technical Specifications of the ETL process flow.
  • Developed ETL jobs using Informatica Intelligent Cloud Services (IICS), Informatica Power Center 10.2, and Informatica Power Exchange.
  • Rewrote Informatica Power Center workflows as IICS jobs.
  • Created reusable mappings in IICS.
  • Used the Application Integration, Data Integration, Data Synchronization, Mass Ingestion, Administrator, and Monitor services in IICS.
  • Constructed IICS mappings, mapping tasks, and processes to extract data from sources such as Salesforce and Oracle and load it into the Teradata and Amazon Redshift data warehouses.
  • Created various types of connections through IICS Administrator and Application Integration service connectors.
  • Analyzed the existing Power Center workflows and rewrote the ETL code as IICS assets such as mappings, mapping tasks, linear taskflows, and advanced taskflows.
  • Implemented Data Synchronization, Data Replication, Mass Ingestion, and mapping tasks in IICS.
  • Built Power Exchange data maps to process real-time CDC and batch CDC data from mainframe DB2 source tables into staging tables.
  • Implemented an audit process to ensure data accuracy between source and EDW systems.
  • Created reusable mapping templates and reused the same logic in IICS mapping tasks.
  • Implemented IICS read/write operations against data sources such as Oracle, SQL Server, Salesforce, Teradata, Amazon S3, MongoDB, and Snowflake.
  • Implemented Pushdown Optimization techniques (illustrated in the SQL sketch after this list).
  • Implemented TPT connections to improve Teradata load performance when processing large data volumes.
  • Responsible for troubleshooting ETL job failures.
  • Managed ETL control tables to handle incremental/CDC data in batch processing.
  • Worked on code changes for existing bugs, including root cause analysis, data fixes, and enhancement requirements.
  • Developed IICS jobs to extract data from various sources into the data lake layer (AWS S3 buckets).
  • Developed ETL jobs to load the Snowflake cloud data warehouse.
  • Worked with SOAP and REST API calls and processed semi-structured data.
  • Involved in data modeling, including creating database tables and views.
  • Created UNIX and PowerShell scripts to automate repetitive tasks such as FTP transfers, archival, and file validation.
  • Performed unit, system, parallel testing, and assisted users during UAT phases.
  • Troubleshot long-running processes by analyzing logs and implemented appropriate tuning techniques.
  • Worked with the Cognos team to understand the reports and fix data issues.
  • Extensively used workflow variables, mapping parameters and mapping variables.
  • Performed Error handling, audit validations and recovery techniques.
  • Tuned the Informatica mappings, sessions for optimal load performance.
  • Implemented session partitioning to achieve parallel processing and improve ETL performance.
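
On the Pushdown Optimization bullet above: with full pushdown enabled, the Integration Service converts mapping logic into SQL that runs inside Snowflake instead of streaming rows through the ETL engine. A rough sketch of the kind of statement this produces, with hypothetical table and column names:

    -- Filter and aggregation logic executed inside Snowflake as one
    -- INSERT ... SELECT, avoiding row-by-row processing in the ETL tier.
    INSERT INTO sales_summary (region, sale_month, total_amount)
    SELECT region,
           DATE_TRUNC('MONTH', sale_date) AS sale_month,
           SUM(amount)                    AS total_amount
      FROM sales_stg
     WHERE sale_date >= '2020-01-01'
     GROUP BY region, DATE_TRUNC('MONTH', sale_date);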

Confidential

Sr. ETL Informatica Developer

Responsibilities:

  • Involved in gathering business requirements and attended technical review meetings to understand the data warehouse model.
  • Developed Technical Specifications of the ETL process flow.
  • Developed and implemented ETL using Informatica Power Center.
  • Established reusable, auditable, high-quality ETL design patterns to be leveraged across all data warehouse layers.
  • Followed team-based development with version control and check-in/check-out.
  • Adhered to the ETL design patterns created by the ETL Architect for development, auditing, testing, and data quality.
  • Designed and built dimensional and physical data models with a clear understanding of best practices.
  • Worked on data profiling and the creation of scorecards to analyze the data against different measures, automating profile and scorecard runs with UNIX scripting.
  • Created mappings, workflows, mapping specifications, rule specifications, mapplets, rules, reference data, LDOs, CDOs, and applications in IDQ.
  • Good experience with bad-record and de-duplication exception handling using the Human Task in IDQ.
  • Extracted data from various sources and loaded into EDWARD system.
  • Created Oracle procedures, functions, packages, triggers, indexes etc.
  • Involved in ETL Design & Development, Testing, Maintenance activities.
  • Enhanced existing production Informatica objects for changed or additional requirements and promoted them back to production after successful QA testing.
  • Used the Teradata utilities BTEQ, MultiLoad, FastLoad, TPT, and FastExport in combination with Informatica for better loads into the Teradata warehouse.
  • Created data quality rules and development and implementation patterns with cleanse, parse, standardization, validation, and scorecard transformations.
  • Created various shell scripts to schedule data cleansing and load jobs, and maintained batch processes using UNIX shell scripts.
  • Implemented complex business rules in Informatica Power Center by creating reusable transformations and mapplets.
  • Defined and captured metadata and rules associated with ETL processes.
  • Developed and automated audit queries to verify that data feeds succeed on a daily basis (see the SQL sketch after this list).
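
A minimal sketch of the daily audit-query pattern referenced in the last bullet, assuming hypothetical source, warehouse, and AUDIT_LOG tables; alerting on FAIL rows was left to the scheduler:

    -- Reconcile source and warehouse row counts for yesterday's feed
    -- and record the outcome (all names are hypothetical).
    INSERT INTO audit_log (feed_name, feed_date, src_count, tgt_count, status)
    SELECT 'ORDERS',
           s.feed_date,
           s.cnt,
           t.cnt,
           CASE WHEN s.cnt = t.cnt THEN 'PASS' ELSE 'FAIL' END
      FROM (SELECT feed_date, COUNT(*) AS cnt
              FROM src_orders
             GROUP BY feed_date) s
      JOIN (SELECT feed_date, COUNT(*) AS cnt
              FROM dw_orders
             GROUP BY feed_date) t
        ON t.feed_date = s.feed_date
     WHERE s.feed_date = CURRENT_DATE - 1;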
