
Sr. Informatica Power Center Developer Resume


SUMMARY

  • 9 years of IT experience in the design, development, and implementation of data warehouses and their applications.
  • Experience in application development: requirement analysis, scoping, development, debugging, testing, and documentation across all phases of the client/server project life cycle.
  • Extensively worked on extracting, transforming, and loading data from various sources such as SQL Server, Oracle 9i/10g, DB2, Teradata, Salesforce, XML, and flat files.
  • 8+ years of industry experience in Data Warehousing with Informatica Power Center 8.X, 9.X and 10.X.
  • 1+ years of hands-on experience with Informatica BDM 10.2.1.
  • 1+ years of implementation experience with Informatica Intelligent Cloud Services (IICS).
  • Proficient in using Replication tasks, Masking tasks, Mass Ingestion tasks, and taskflows in IICS.
  • Created Intelligent Structure Models from Excel, CSV, PDF, and TXT files.
  • Installed the Secure Agent and created connections for flat files, Oracle, SQL Server, etc.
  • Created mappings, mapping tasks, taskflows, synchronization tasks, replication tasks, and masking tasks in IICS.
  • Involved in setting up Hadoop configurations (Hadoop cluster, Hive, Spark, and Blaze connections) using Informatica BDM.
  • Created and deployed BDM applications and ran the applications, workflows, and mappings.
  • Experience in Data Modeling using Dimensional Data Modeling techniques such as Star Schema, Snowflake Schema.
  • Proficient in using Informatica Designer, Workflow manager, Workflow Monitor, Repository Manager to create, schedule and control workflows, tasks and sessions.
  • Translated complex functional and technical requirements into detailed designs.
  • Extensive experience in performance tuning of existing workflows using pushdown optimization, session partitioning, etc.
  • Created complex mappings using Unconnected Lookup, Sorter, Aggregator, Union, Rank, Normalizer, Update Strategy, and Router transformations to populate target tables efficiently.
  • Extensive experience in using the Debugger utility of Informatica to check for errors in mappings and make the appropriate changes to generate the required results.
  • Developed ETL audits and controls to ensure data quality meets or exceeds the standards and thresholds defined by the business customer.
  • Experience in identifying, researching, and resolving ETL issues and producing root cause analysis documentation.
  • Experience in Agile software development methodology and implementations.
  • Well versed in writing UNIX shell scripts for running Informatica workflows (pmcmd commands), file manipulation, housekeeping functions, and FTP programs.
  • Extensive working experience with Teradata utilities, viz. TPump, BTEQ, FastExport, FastLoad, MultiLoad, etc.
  • Well versed in developing complex SQL queries, unions, and multi-table joins; experienced with views, procedures, triggers, functions, and PL/SQL scripts.
  • Experience in using the batch job scheduling tool Autosys - CA Workload Automation Control Center.
  • Experienced in creating ETLs in relational and dimensional databases; good understanding of data warehouse best practices and methodologies (Kimball & Inmon) and logical/physical modeling.
  • Wrote complex queries, stored procedures, batch scripts, triggers, indexes, and functions using T-SQL for SQL Server 2012.
  • Created SSRS reports involving a variety of features such as charts, filters, sub-reports, drill-down, and drill-through; involved in dashboard reporting.
  • Designed and created Report templates, bar graphs and pie charts based on the financial data using SSRS 2012.
  • Involved in testing of large warehouse projects; took part in unit, system, regression, performance, reconciliation, and integration testing.
  • Expert in finding bottlenecks in the database and/or ETLs that impact performance and resolving them proactively to meet application service level agreements.
  • Expert in using the Informatica Debugger to analyze complex mappings and find defects.
  • Worked in a Global Delivery Model (onsite/offshore); coordinated projects with multiple teams across different locations in a Technology Lead role.
  • Excellent written and verbal communication skills; experienced in working with senior-level managers, business people, and developers across multiple disciplines.
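One of the bullets above describes building ETL audits and reconciliation controls. As a minimal, hedged sketch of that idea (the function names and the tolerance parameter are illustrative, not taken from any specific project), a source-vs-target audit can be reduced to a row-count comparison plus a coarse column checksum:

```python
def reconcile_counts(source_rows, target_rows, tolerance=0):
    """Row-count audit: return (passed, delta) for source vs. target."""
    delta = abs(source_rows - target_rows)
    return delta <= tolerance, delta

def column_checksum(rows, column):
    """Sum one numeric column as a cheap content-level reconciliation check."""
    return sum(r[column] for r in rows)

# Example: a load whose target is 3 rows short fails a zero-tolerance audit.
passed, delta = reconcile_counts(1_000, 997)
```

In practice the counts and checksums would come from SQL queries against the source and target systems, and the tolerances from the business-defined thresholds mentioned above.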

TECHNICAL SKILLS

Data warehousing ETL: Informatica Power Center 10.x/9.x/8.x/7.x, DW Designer, Mapping Designer, Workflow Manager, Metadata Reporter, Workflow Monitor, Mapplets, Transformations, SSIS, Informatica BDM 10.2.1, Informatica Intelligent Cloud Services (IICS)

Data Modeling: Dimensional Data Modeling (Star Schema, Snow-Flake, FACT, Dimensions), Physical and Logical Data Modeling, Entities, Attributes, Cardinality, ER Diagrams, ERWIN 4.5/4.0, DB Designer

Reporting & BI: SQL Server Reporting Services, Tableau, Power BI

Job Scheduling: Autosys, Control M, CA WorkStation, Cron jobs

Programming: SQL, PL/SQL, Transact SQL, Unix Shell Scripting, Python, HTML, C#, C++

System Design & Development: Requirements Gathering and Analysis, Data Analysis, ETL Design, Development and Testing, UAT, Implementation

Databases: Oracle 10g/9i/8i, MS SQL Server 2000/2005/2008, Teradata, Hive, Impala, MS Access, DB2

Environment: RHEL (Red Hat Enterprise Linux) 4.0, UNIX (Sun Solaris 2.7/2.6, HP-UX 10.20/9.0, IBM AIX 4.3/4.2), Windows 10/7/2003/2000/XP/98, Windows NT, Linux, MS-DOS

PROFESSIONAL EXPERIENCE

Sr. Informatica Power Center Developer

Confidential

Responsibilities:

  • Involved in all phases of the SDLC: requirement gathering, design, development, testing, production deployment, user training, and support for the production environment.
  • Worked with the data architect to identify automation opportunities in areas such as data migration from legacy systems.
  • Worked with subject matter experts and the project team to identify, define, collate, document, and communicate the data migration requirements.
  • Partnered with DBAs to transform logical data models into physical database designs while optimizing the performance and maintainability of the physical database.
  • Created complex mappings involving Slowly Changing Dimensions and implemented business logic using Informatica Power Center.
  • Used Filter, Sorter, Expression, Aggregator, Joiner, Router, Normalizer, Union, Lookup, etc. transformations to convert complex business logic into ETL code.
  • Experience in iPaaS integration technologies and complete implementation experience of Informatica Cloud Application and Data Integration with Salesforce platform or any other major cloud-based platforms.
  • Experience with Oracle SQL and PL/SQL programming and used Database utility programs like TOAD and SQL Navigator.
  • Experience in Oracle SQL and PL/SQL including all database objects: Stored procedures, Stored functions, Packages, Triggers, cursors, REF cursors, Parameterized cursors, Views, Materialized Views
  • Designed workflows with many sessions using Decision, Assignment, Event Wait, and Event Raise tasks; used the Informatica scheduler to schedule jobs.
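The Slowly Changing Dimension mappings mentioned above follow the well-known Type 2 semantics: expire the current version of a changed row and insert a new one. A minimal Python sketch of that logic (the customer/city columns and dates are hypothetical, purely for illustration):

```python
from datetime import date

HIGH_DATE = date(9999, 12, 31)  # conventional open-ended effective date

def apply_scd2(dimension, incoming, load_date):
    """Apply SCD Type 2: close out changed current rows, insert new versions."""
    for rec in incoming:
        current = next((d for d in dimension
                        if d["cust_id"] == rec["cust_id"] and d["is_current"]),
                       None)
        if current and current["city"] == rec["city"]:
            continue                        # unchanged: nothing to do
        if current:                         # changed: expire the old version
            current["eff_to"] = load_date
            current["is_current"] = False
        dimension.append({"cust_id": rec["cust_id"], "city": rec["city"],
                          "eff_from": load_date, "eff_to": HIGH_DATE,
                          "is_current": True})
    return dimension
```

In a Power Center mapping the same decision is typically made with a Lookup on the dimension plus an Update Strategy transformation routing rows to insert or update.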

Environment: Power Center 10.2.1, Oracle 12.5, Flat files, SQL, SQL Developer, Salesforce, UNIX Shell Scripts, Informatica Scheduler, UrbanCode, Jenkins, SourceTree.

Sr. Informatica BDM Developer/ Sr. Informatica Power Center Developer

Confidential

Responsibilities:

  • Involved in setting up Hadoop configurations (Hadoop cluster, Hive, Spark, and Blaze connections) using Informatica BDM.
  • Created and deployed BDM applications and ran the applications, workflows, and mappings.
  • Designed and developed BDM mappings in Hive mode for large volumes, extracting data from the DWH to the Data Lake.
  • Involved in designing Logical/Physical Data Model for IDQ custom metadata.
  • Configured the sessions and workflows for recovery and high availability.
  • Developed BDM mappings using Informatica Developer and created HDFS files in Hadoop system.
  • Implemented SCD Type 2 mappings using BDE and loaded the data into Hadoop Hive tables using pushdown mode.
  • Involved in designing the physical data model for Hadoop Hive tables.
  • Wrote HiveQL queries to validate HDFS files and Hive table data against the requirements; developed Hive and Impala tables and queries.
  • Extensively worked on code migration from the development box to various environments; created and managed Informatica reference data.
  • Created complex mappings involving Slowly Changing Dimensions and implemented business logic using Informatica Power Center.
  • Used Filter, Sorter, Expression, Aggregator, Joiner, Router, Normalizer, Union, Lookup, etc. transformations to convert complex business logic into ETL code.
  • Designed workflows with many sessions using Decision, Assignment, Event Wait, and Event Raise tasks; used the Informatica scheduler to schedule jobs.
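The HiveQL validation work described above boils down to a handful of data-quality checks on the loaded data: no null or duplicate keys, required columns populated. A hedged sketch of such a check (the column names are hypothetical examples, not from any real table):

```python
def validate_rows(rows, key, required):
    """Count basic data-quality violations in a loaded extract."""
    seen = set()
    issues = {"null_key": 0, "dup_key": 0, "missing_required": 0}
    for r in rows:
        k = r.get(key)
        if k is None:
            issues["null_key"] += 1
            continue                # a null key can't be checked further
        if k in seen:
            issues["dup_key"] += 1
        seen.add(k)
        if any(r.get(c) is None for c in required):
            issues["missing_required"] += 1
    return issues
```

Against a Hive table the equivalent checks would be COUNT-based HiveQL queries, e.g. grouping on the key and filtering with HAVING COUNT(*) > 1 to find duplicates.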

Environment: Informatica BDM 10.2.1, Power Center 10.2.1, Hadoop 2.7.1, Hive 2.3.2, Oracle 12c, Flat files, SQL, Omniture, SQL Developer, Windows XP, UNIX Shell Scripts, Cron jobs.
