Sr. Informatica Bdm Developer/ Sr. Informatica Power Center Developer Resume

Hanover, MD

PROFESSIONAL SUMMARY:

  • 8+ years of IT experience in the design, development and implementation of data warehouses and their applications.
  • Experience in application development: requirement analysis, scoping, development, debugging, testing and documentation across the phases of a Client/Server project life cycle.
  • Extensively worked on extracting, transforming and loading data from various sources such as SQL Server, Oracle 9i/10g, DB2, Teradata, XML and flat files.
  • 8+ years of industry experience in Data Warehousing with Informatica Power Center 8.X, 9.X and 10.X.
  • 1+ years of hands on experience on Informatica BDM 10.2.1.
  • 1+ years of implementation experience in Informatica Intelligent Cloud Services (IICS).
  • Installed the Secure Agent and created connections to flat files, Oracle, SQL Server, etc.
  • Created mappings, mapping tasks, taskflows, synchronization tasks, replication tasks and masking tasks in IICS.
  • Involved in setting up the Hadoop configuration (Hadoop cluster, Hive, Spark and Blaze connections) in Informatica BDM.
  • Created & deployed BDM applications and ran the applications, workflows and mappings.
  • Experience in Data Modeling using Dimensional Data Modeling techniques such as Star Schema, Snowflake Schema.
  • Proficient in using Informatica Designer, Workflow manager, Workflow Monitor, Repository Manager to create, schedule and control workflows, tasks and sessions.
  • Translated complex functional and technical requirements into detailed designs.
  • Extensive experience in performance tuning of existing workflows with pushdown optimization, session partitioning etc.
  • Created complex mappings using Unconnected Lookup, Sorter, Aggregator, Union, Rank, Normalizer, Update Strategy and Router transformations to populate target tables efficiently.
  • Extensive experience in using the Debugger utility of Informatica to check for errors in mappings and make the appropriate changes to generate the required results.
  • Developed ETL audits and controls to ensure the quality of data meets or exceeds the standards and thresholds defined by the business customer.
  • Experience in identifying, researching, and resolving ETL issues and producing root cause analysis documentation.
  • Experience in Agile software development methodology and implementations.
  • Well versed in writing UNIX shell scripts for running Informatica workflows (pmcmd commands), file manipulation, housekeeping functions and FTP programs (a short pmcmd wrapper sketch follows this list).
  • Well versed in developing complex SQL queries, unions and multiple-table joins, with experience in views, procedures, triggers, functions and PL/SQL scripts.
  • Experience in using the batch job scheduling tool Autosys (CA Workload Automation Control Center).
  • Experienced in creating ETLs in relational and dimensional databases; good understanding of data warehouse best practices and methodologies (Kimball & Inmon) and of logical and physical modeling.
  • Wrote complex queries, stored procedures, batch scripts, triggers, indexes and functions using T-SQL for SQL Server 2012.
  • Created SSRS reports involving a variety of features like charts, filters, sub-reports, drill-down and drill-through, and was involved in dashboard reporting.
  • Designed and created Report templates, bar graphs and pie charts based on the financial data using SSRS 2012.
  • Involved in testing of large warehouse projects. Took part in Unit, System, Regression, Performance, Reconciliation, Integration testing.
  • Expert in finding bottlenecks in the database and/or ETLs that impact performance and resolving them proactively to meet application service level agreements.
  • Expert in using the Informatica Debugger to analyze complex mappings and find defects.
  • Worked in a Global Delivery Model (Onsite/Offshore); coordinated projects with multiple teams across different locations in a Technology Lead role.
  • Excellent written and verbal communication skills; experienced in working with senior-level managers, business people and developers across multiple disciplines.
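
A minimal sketch of the kind of pmcmd wrapper script referenced above. The domain, integration service, folder and workflow names are hypothetical placeholders, and the password is assumed to come from an environment variable rather than being hard-coded.

```sh
#!/bin/bash
# Minimal pmcmd wrapper sketch. Domain, service, folder and workflow names
# are hypothetical; INFA_PWD is assumed to be exported by the calling job.
pmcmd startworkflow \
    -sv INT_SVC_PROD -d Domain_Prod \
    -u etl_user -p "${INFA_PWD}" \
    -f SALES_DW -wait wf_load_sales_daily

# pmcmd exits non-zero when the workflow fails, so downstream housekeeping
# (archiving, notifications) can be gated on the exit code.
if [ $? -ne 0 ]; then
    echo "wf_load_sales_daily failed" >&2
    exit 1
fi
```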

TECHNICAL SKILLS:

Data warehousing ETL: Informatica Power Center 10.x/9.x/8.x/7.x, DW Designer, Mapping Designer, Workflow Manager, Metadata Reporter, Workflow Monitor, Mapplets, Transformations, SSIS, Informatica BDM 10.2.1, Informatica MDM, IDQ

Data Modeling: Dimensional Data Modeling (Star Schema, Snowflake Schema, Facts, Dimensions), Physical and Logical Data Modeling, Entities, Attributes, Cardinality, ER Diagrams, ERwin 4.5/4.0, DB Designer

Reporting & BI: SQL Server Reporting Services, Tableau, Power BI

Job Scheduling: Autosys, Control M, CA WorkStation, Cron jobs

Programming: SQL, PL/SQL, Transact SQL, Unix Shell Scripting, Python, HTML, C#, C++

System Design & Development: Requirements Gathering and Analysis, Data Analysis, ETL Design, Development and Testing, UAT, Implementation

Databases: Oracle 10g/9i/8i, MS SQL Server 2000/2005/2008, Teradata, Hive, Impala, MS Access, DB2

Environment: RHEL (Red Hat Enterprise Linux) 4.0, UNIX (Sun Solaris 2.7/2.6, HP-UX 10.20/9.0, IBM AIX 4.3/4.2), Windows 10/7/2003/2000/XP/98, Windows NT, Linux, MS-DOS

PROFESSIONAL EXPERIENCE:

Sr. Informatica BDM Developer/ Sr. Informatica Power Center Developer

Confidential, Hanover, MD

Responsibilities:

  • Involved in setting up the Hadoop configuration (Hadoop cluster, Hive, Spark and Blaze connections) in Informatica BDM.
  • Created & deployed BDM applications and ran the applications, workflows and mappings.
  • Designed & developed BDM mappings in Hive mode to extract large volumes of data from the DWH to the Data Lake.
  • Involved in designing Logical/Physical Data Model for IDQ custom metadata.
  • Configured sessions & workflows for recovery and High Availability.
  • Developed BDM mappings using Informatica Developer and created HDFS files in Hadoop system.
  • Implemented SCD Type 2 mappings using BDE and loaded the data into Hadoop Hive tables using pushdown mode.
  • Involved in designing the Physical Data Model for Hadoop Hive tables.
  • Wrote HiveQL queries to validate HDFS files and Hive table data against the requirements, and developed Hive and Impala tables and queries (see the validation sketch after this list).
  • Extensively worked on code migration from the development box to various environments; created & managed Informatica reference data.
  • Created complex mappings involving Slowly Changing Dimensions and implemented business logic using Informatica Power Center.
  • Used Filter, Sorter, Expression, Aggregator, Joiner, Router, Normalizer, Union, Lookup and other transformations to convert complex business logic into ETL code.
  • Designed workflows with many sessions using Decision, Assignment, Event Wait and Event Raise tasks; used the Informatica scheduler to schedule jobs.
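
A minimal sketch of the HDFS-to-Hive validation described above, assuming one record per line in the landed files; the paths, database and table names are hypothetical.

```sh
#!/bin/bash
# Reconcile raw HDFS record counts against the Hive table the BDM load feeds.
# Paths and table names are hypothetical placeholders.
HDFS_DIR=/data/lake/sales/incoming
HIVE_TABLE=lake_db.sales_txn

# Record count in the landed files (assumes one record per line).
hdfs_count=$(hadoop fs -cat "${HDFS_DIR}"/*.dat | wc -l)

# Row count in the Hive table after the load (-S keeps the output clean).
hive_count=$(hive -S -e "SELECT COUNT(*) FROM ${HIVE_TABLE};")

if [ "${hdfs_count}" -ne "${hive_count}" ]; then
    echo "Count mismatch: HDFS=${hdfs_count} Hive=${hive_count}" >&2
    exit 1
fi
echo "Validation passed: ${hive_count} rows"
```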

Environment: Informatica BDM 10.2.1, Power Center 10.2.1, Hadoop 2.7.1, Hive 2.3.2, Oracle 12c, Flat files, SQL, Omniture, SQL Developer, Windows XP, UNIX Shell Scripts, Cron jobs.

Sr. Informatica Developer

Confidential, Columbus, OH

Responsibilities:

  • Involved in all phases of the SDLC: requirement gathering, design, development, testing, production, user training and production support.
  • Worked with the data architect to identify automation opportunities in areas like data migration from legacy systems.
  • Worked with subject matter experts and the project team to identify, define, collate, document and communicate the data migration requirements.
  • Developed RTC stories and documents by translating business requirements into system requirements for an Agile global delivery model.
  • Partnered with DBAs to transform logical data models into physical database designs while optimizing the performance and maintainability of the physical database.
  • Created complex mappings involving Slowly Changing Dimensions and implemented business logic using Informatica Power Center.
  • Used Filter, Sorter, Expression, Aggregator, Joiner, Router, Normalizer, Union, Lookup and other transformations to convert complex business logic into ETL code.
  • Designed workflows with many sessions using Decision, Assignment, Event Wait and Event Raise tasks; used the Informatica scheduler to schedule jobs.
  • Practical experience with query tools such as Teradata SQL Assistant.
  • Extensive working experience with Teradata utilities such as BTEQ, FastExport, FastLoad, MultiLoad and TPump (see the BTEQ sketch after this list).
  • Involved in performance analysis and SQL query tuning using EXPLAIN plans, COLLECT STATISTICS and Teradata Viewpoint.
  • Involved in automating daily tasks using SQL, PL/SQL and UNIX shell scripting so they complete reliably without failures.
  • Optimized performance at the source, target, mapping and session levels.
  • Examples include query tuning, Informatica partitioning, redesigning mappings to remove bottlenecks, and caching warehouse and mart data before processing the actual source file data.
  • Involved in performance tuning of Informatica mappings, stored procedures and SQL queries.
  • Participated in weekly status meetings, conducted internal and external reviews and formal walkthroughs among various teams, and documented the proceedings.
  • Developed and optimized unit tests for complex Informatica mappings in RubyMine using the RSpec, FactoryGirl and Sequel gems.
  • Used the Cucumber gem with object-oriented Ruby to create and run acceptance tests.
  • Updated and incorporated Ruby scripts in Jenkins as a CI server to run the regression testing suite.
  • Used RTC and RRC to track application code and validate that it works according to requirements.
  • Provided Knowledge Transfer to the end users and created extensive documentation on the design, development, implementation, daily loads and process flow of the Informatica Mappings.
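
A minimal BTEQ sketch of the statistics refresh and EXPLAIN-based tuning described above; the logon string, database and table names are hypothetical placeholders.

```sh
#!/bin/bash
# BTEQ sketch: refresh optimizer statistics, then capture the plan for a
# slow query. Logon string and object names are hypothetical.
bteq <<'EOF'
.LOGON tdprod/etl_user,password;

COLLECT STATISTICS ON edw.claim_fact COLUMN (claim_dt, member_id);

EXPLAIN
SELECT m.member_id, SUM(c.paid_amt)
FROM edw.claim_fact c
JOIN edw.member_dim m ON c.member_id = m.member_id
WHERE c.claim_dt >= DATE '2016-01-01'
GROUP BY m.member_id;

.LOGOFF;
.QUIT;
EOF
```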

Environment: Informatica Power Center 10.1/9.6 (Informatica Server, Informatica Repository Server, Repository Manager, Designer, Workflow Manager, and Workflow Monitor), Oracle 10g, Teradata, PL/SQL, Flat files, XML Files, ERwin 4.0, TOAD 12.8

ETL/ Informatica Developer

Confidential, Newton, MA

Responsibilities:

  • Worked with various business users for requirement gathering and analysis.
  • Designed the ETL mapping document to map source data elements to the target based on a Star-Schema dimensional model.
  • Involved in designing a star schema based data model with dimensions and facts.
  • Designed and developed complex mappings using various transformations in Designer to extract data from relational sources like Oracle and SQL Server and non-relational sources like flat files, transform it per company requirements and load it into Oracle tables.
  • Worked extensively in Informatica Designer and Workflow Manager to create sessions and workflows, monitor the results and validate them against the requirements.
  • Worked in Informatica Repository Manager to create users and groups, assign read, write and execute privileges by assigning users to groups, create folders and convert them to shared folders.
  • Created shortcuts in individually assigned folders for the sources and targets imported into a centralized shared folder.
  • Created complex SCD Type 1 & Type 2 mappings using Lookup, Joiner, Aggregator, Filter, Normalizer, Update Strategy and Router transformations (an SQL sketch of the Type 2 logic follows this list).
  • Used different tasks (Session, Command, Decision, Timer, Email, Event-Raise, Event-Wait, Control) in the workflow.
  • Performance-tuned Informatica targets, sources, mappings and sessions for large data files by increasing the data cache size, sequence buffer length and target-based commit interval.
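
For illustration only, the Type 2 logic behind those mappings can be sketched as two SQL statements run through the DB2 command line processor; the actual implementation used Informatica Lookup/Update Strategy transformations, and the CUSTOMER_DIM/CUSTOMER_STG tables and their columns are hypothetical.

```sh
#!/bin/bash
# SCD Type 2 sketch against DB2; the real logic lived in Informatica
# mappings. Database, table and column names are hypothetical.
db2 connect to EDWDB

# Expire the current row for customers whose tracked attributes changed.
db2 -v "UPDATE customer_dim d
        SET d.eff_end_dt = CURRENT DATE - 1 DAY, d.current_flag = 'N'
        WHERE d.current_flag = 'Y'
          AND EXISTS (SELECT 1 FROM customer_stg s
                      WHERE s.customer_id = d.customer_id
                        AND (s.addr <> d.addr OR s.segment <> d.segment))"

# Insert a new current version for changed and brand-new customers.
db2 -v "INSERT INTO customer_dim
          (customer_id, addr, segment, eff_start_dt, eff_end_dt, current_flag)
        SELECT s.customer_id, s.addr, s.segment,
               CURRENT DATE, DATE('9999-12-31'), 'Y'
        FROM customer_stg s
        WHERE NOT EXISTS (SELECT 1 FROM customer_dim d
                          WHERE d.customer_id = s.customer_id
                            AND d.current_flag = 'Y')"

db2 connect reset
```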

Environment: Informatica Power center 9.6, DB2 UDB v9.7, UNIX AIX, Control M, IBM Data Studio, Squirrel SQL Client, Win SCP

ETL Developer/Informatica Developer

Confidential, Hermitage, PA

Responsibilities:

  • Took part in detail-level design and coding of Informatica components that load source data into the Data Integration Hub.
  • Complex ETLs cleanse, enrich, validate and apply various rules to journal transactions before PeopleSoft processing; posted journal transactions are then loaded into the Data Mart along with updated dimensions for reporting.
  • Worked with the PeopleSoft team to understand interfaces and design ETLs accordingly, including an Informatica Web Service transformation to validate source fields against PeopleSoft combination-edit fields.
  • Took part in creating the CA Autosys batch that controls the nightly cycles, identifying and creating the correct dependencies within and external to the system (a JIL sketch follows this list).
  • Extensively used an error handling strategy to route error records from the source to error tables and reprocess them after data correction.
  • Wrote UNIX scripts to FTP archival files from one server to another, archive jobs, and perform checksum validations, duplicate file checks, header row count checks and credit/debit validations.
  • Performance-tuned Informatica components, which is critical as the data volume exceeds 1-2 million transactions every day; examples include query tuning, Informatica partitioning, redesigning mappings to remove bottlenecks, and caching warehouse and mart data before processing the actual source file data.
  • Supported Functional, Integration and User Acceptance testing with timely resolution of defects.
  • Migrated Informatica code from lower to higher environments using Repository Manager.
  • Deployment contact for the Data Integration track; created milestones for deploy events and coordinated to ensure smooth implementation.
  • First point of contact for all DI Hub issues; resolved production job failures and incidents related to reporting data issues.
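
A minimal JIL sketch of the kind of nightly batch definition described above, loaded through the jil command; the box, job, machine and script names are hypothetical, and the real batch had many more jobs and dependencies.

```sh
#!/bin/bash
# Load a small Autosys batch definition. All names are hypothetical.
jil <<'EOF'
/* Box that owns the nightly DI Hub cycle */
insert_job: DIHUB_NIGHTLY_BOX   job_type: BOX
start_times: "01:00"

/* Land and validate source files first */
insert_job: DIHUB_FILE_CHECK   job_type: CMD
box_name: DIHUB_NIGHTLY_BOX
machine: etl_host1
command: /app/etl/scripts/file_check.sh

/* Journal load runs only after the file checks succeed */
insert_job: DIHUB_JRNL_LOAD   job_type: CMD
box_name: DIHUB_NIGHTLY_BOX
machine: etl_host1
command: /app/etl/scripts/run_wf.sh wf_journal_load
condition: success(DIHUB_FILE_CHECK)
EOF
```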

Environment: Informatica Power Center 9.1, Teradata, DB2 UDB, UNIX, CA Autosys, HP Quality Center

ETL Developer / BI Developer

Confidential

Responsibilities:

  • Took part in detail-level design and coding of Informatica components that load source data into the Data Integration Hub.
  • Complex ETLs cleanse, enrich, validate and apply various rules to journal transactions before PeopleSoft processing; posted journal transactions are then loaded into the Data Mart along with updated dimensions for reporting.
  • Worked with the PeopleSoft team to understand interfaces and design ETLs accordingly, including an Informatica Web Service transformation to validate source fields against PeopleSoft combination-edit fields.
  • Took part in creating the CA Autosys batch that controls the nightly cycles, identifying and creating the correct dependencies within and external to the system.
  • Extensively used an error handling strategy to route error records from the source to error tables and reprocess them after data correction.
  • Wrote UNIX scripts to FTP archival files from one server to another, archive jobs, and perform checksum validations, duplicate file checks, header row count checks and credit/debit validations (a validation script sketch follows this list).
  • Performance-tuned Informatica components, which is critical as the data volume exceeds 5-8 million transactions every day.
  • Examples include query tuning, Informatica partitioning, redesigning mappings to remove bottlenecks, and caching warehouse and mart data before processing the actual source file data.
  • First point of contact for all DI Hub issues; resolved production job failures and incidents related to reporting data issues.
  • Involved in extracting data for cleansing from SQL sources, then transforming and loading it into data warehouse targets using SSIS.
  • Extensively worked on data extraction, transformation and loading from source to target systems using SQL Server 2008 Integration Services.
  • Imported source/target tables from the respective databases using the Execute Package task and other control flow tasks in SQL Server Integration Services.
  • Created event handlers for the package using the Event Handlers tab.
  • Used transformations like Derived Column, Conditional Split, Aggregate, Lookup, Sort, Data Conversion and Script Task to load data into the data warehouse.
  • Used stored procedures to load the data into the data warehouse.
  • Handled package configuration, logging and auditing.
  • Involved in using sub-queries, joins, functions and set operations in MS SQL Server 2008 and Oracle 9i.
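
A minimal sketch of the file validations described above (duplicate file check, checksum validation, header row count); the directory layout, delimiter and header format are hypothetical.

```sh
#!/bin/bash
# Pre-load file validations; paths and file naming are hypothetical.
IN_DIR=/app/etl/inbound
ARCH_DIR=/app/etl/archive

for f in "${IN_DIR}"/journal_*.dat; do
    base=$(basename "$f")

    # Duplicate file check: skip anything already archived.
    if [ -e "${ARCH_DIR}/${base}" ]; then
        echo "Duplicate file ${base}, skipping" >&2
        continue
    fi

    # Checksum validation against the .md5 shipped with the file.
    ( cd "${IN_DIR}" && md5sum -c "${base}.md5" ) || exit 1

    # Header row count: assume the first line carries the expected
    # detail record count in the second pipe-delimited field.
    expected=$(head -1 "$f" | cut -d'|' -f2)
    actual=$(( $(wc -l < "$f") - 1 ))
    if [ "${expected}" -ne "${actual}" ]; then
        echo "Row count mismatch in ${base}: header=${expected} actual=${actual}" >&2
        exit 1
    fi

    cp "$f" "${ARCH_DIR}/"   # keep an archive copy before the load
done
```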

ETL Developer

Confidential

Responsibilities:

  • Involved in full life cycle design and development of Data warehouse.
  • Interacted with business representatives for requirement analysis and to define business and functional specifications.
  • Developed Complex transformations, Mapplets using Informatica to Extract, Transform and load data into Data marts, Enterprise Data warehouse (EDW) and Operational data store (ODS).
  • Created reusable code, such as reusable sessions and transformations, to plug into existing mappings.
  • Performed slowly changing dimension (SCD) Type1, Type2 and Type3 mappings.
  • Actively involved in gathering requirements and acquiring application knowledge from the Business.
  • Extracted data from different sources like Oracle, flat files, MQ and web services.
  • Developed ETL routines using Informatica Power Center and created mappings involving transformations like Lookup, Aggregator, Rank, Expression, Mapplets, connected and unconnected Stored Procedures, SQL overrides in Lookups, source filters in Source Qualifiers, and data flow management into multiple targets using Routers.
  • Extensively used Mapping Variables and Mapping Parameters to execute complex business logic.
  • Designed and developed complex ETL mappings making use of Connected/Unconnected Lookup, Normalizer and Stored Procedure transformations.
  • Developed PL/SQL stored procedures to implement complex business logic and integrated them with the ETL process as pre-/post-session stored procedures.
  • Proficient in using Source Analyzer, Warehouse Designer, Transformation Designer, Mapping Designer and Mapplet Designer.
  • Used the Debugger to debug critical mappings by setting breakpoints, and troubleshot issues by checking session and workflow logs.
  • Extensively used transformations like Router, Aggregator, Lookup, Source Qualifier, Joiner, Expression, Sequence Generator, Normalizer, XML Generator, SQL and Stored Procedure to extract data in compliance with the business logic developed.
  • Involved in massive data cleansing prior to data staging.
  • Involved in performance tuning of the ETL processes.
  • Created new database objects like Procedures, Functions, Packages, Triggers, Indexes and Views in Development and Production environments of ODS and DWH.
  • Debugged invalid mappings using breakpoints, and tested stored procedures, functions, Informatica sessions, batches and the target data.
  • Developed shell scripts and PL/SQL procedures for creating/dropping tables and indexes for performance, as pre- and post-session management steps (a sketch follows this list).
  • Used SSIS to create ETL packages to Validate, Extract, Transform and Load data to Data Warehouse and Data Mart Databases.
  • Designed a new schema for the Data Mart databases.
  • Generated various reports such as cascade-type reports, tabular reports and drill-down reports.
  • Created stored procedures, functions and triggers for retrieval and update of data in the database.
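
A minimal sketch of a pre-session script of the kind described above, dropping target indexes through SQL*Plus before a bulk load (a matching post-session script would rebuild them); the connect string and index names are hypothetical.

```sh
#!/bin/bash
# Pre-session index drop before a bulk load; a post-session script
# recreates the indexes. Connect string and object names are hypothetical.
sqlplus -s etl_user/"${ORA_PWD}"@odsdb <<'EOF'
WHENEVER SQLERROR EXIT FAILURE
-- Dropping indexes avoids per-row index maintenance during the load.
DROP INDEX ods.sales_fact_dt_ix;
DROP INDEX ods.sales_fact_cust_ix;
EXIT
EOF
```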
