
Informatica BDM Engineer/Administration Resume

SUMMARY:

  • 8 years of data warehousing experience in the development, installation, upgrade, administration, analysis, design, and implementation of data warehousing ETL solutions using Informatica PowerCenter, Metadata Manager, Data Validation Option, BDM, BDQ, EDC, and Data Lake.
  • Configured Informatica BDM/BDQ/EIC/IDL 10.1/10.2.1 and integrated it with the Kerberos-enabled Cloudera and Hortonworks Hadoop frameworks.
  • Applied incremental Emergency Bug Fixes (EBFs) to resolve product issues and followed up with Informatica support as needed.
  • Familiar with the REST API calls to run scanners or update business terms in EDC.
  • Maintained security administration natively or via Active Directory, supporting access privileges for different levels of users/groups/roles, both built-in and customized.
  • Familiar with Kerberos-enabled Informatica domains for the SSO option.
  • Worked on OS profiles to let multiple development teams work on a single shared file system, and used the impersonation property on the IDS against Hadoop systems, since access on the Hadoop cluster was governed by Apache Ranger policies.
  • Experienced in processing healthcare HL7 data files using the Informatica Data Transformation libraries and loading them into Hadoop.
  • Created customized BDM mappings/workflows for incremental loads using Informatica Developer and deployed them as part of an application on the Data Integration Service for native execution, or pushed down using the Blaze or Spark execution engines.
  • Assisted developers in debugging, since BDM logs are scattered between the native servers and YARN; Blaze and Spark write separate sets of logs to different configured directories, a prerequisite for the Informatica Big Data products.
  • As part of change control, helped developers update service properties and migrate code between environments.
  • Have worked with the parameter sets/files at mapping and workflow level.
  • Familiar with using Sqoop arguments required to source data from any traditional database systems into Hadoop.
  • Experienced in writing native HiveQL and comparing load performance with BDM jobs.
  • Created different scanners (Hive/Oracle/PowerCenter/Platform) to load data assets into a single repository, EDC.
  • Scanners have the option to profile the data at column level, perform data discovery against the data domain groups, or both.
  • Loaded the business terms of the organization's data dictionary into EDC, assigned them to data assets, and created lineages for the data flow.
  • Data scientists can use the IDL to preview data in any table and create their own projects, where they can prepare or virtualize data assets, blend data across multiple assets, apply rules, merge assets, and publish the results back to the lake.
  • Upload any Delimited files directly into the data lake from the UI.
  • Configured PowerCenter to load data directly into Hive, without Informatica BDM, for less resource-intensive jobs.
  • Configured DVO for testing Informatica upgrades, comparing the data loaded into two different tables by the old and new Informatica versions.
  • Worked with business analysts to create proof-of-concept DVO test cases based on customer needs around unit testing, QA testing, and, most importantly, production reconciliation.
  • Also used DVO for unit and regression testing, defining sets of test cases that can be reused at any time later.
  • Understand the integration of third-party schedulers such as Autosys, which use the standard command-line options to call Informatica jobs. Wrote a common shell script that takes the domain, repository, Integration Service, user credentials, folder name, and workflow as parameters to run jobs via the Informatica command line (pmcmd).
  • Used Autosys to schedule Informatica server maintenance, including backup jobs for the domain and PowerCenter repositories.
  • Worked on the different application services running on the domain, such as the PowerCenter Repository, Integration, Metadata Manager, Data Integration, and Model Repository services.
  • Handled upgrade requests from the 9.1 versions to 10.1/10.2.
  • Wrote shell scripts to automate the weekly Informatica domain and repository backup process, for restoration in case of crashes.
  • Understand the command-line utilities (pmcmd, pmrep, infacmd) that help automate the daily data loads. Used all the different methods of migrating code between environments, including deployment groups.
  • Working experience using Informatica PowerCenter to create and schedule workflows, reducing manual intervention.
  • Understand the code-versioning option that comes with the team-based license, which gives the ability to keep multiple versions of the same code.
  • Automated the weekly repository backups, archived logs, and removed unnecessary caches to free up space on the file system.
  • Worked on automated regression testing for each production release, which involved creating UNIX scripts and coding Informatica mappings to compare data quality before releasing to end users.
  • Worked on audit, balance, and control of jobs to ensure no transactions are dropped during ETL, using automated shell scripts that control notifications to the support team.
  • Data modeling knowledge using dimensional data modeling, star schema modeling, snowflake modeling, and fact and dimension tables.
  • Designed and developed Informatica mappings for Salesforce. Built Informatica mappings to migrate data from SQL Server, Oracle, and flat files into the Salesforce sandbox and production using Informatica PowerCenter with the Salesforce PowerConnect adapter.
  • Experience in handling large deployments.
  • Successfully performance-tuned and maintained ETL applications and SQL databases, significantly reducing process run times and standardizing processes in production.
  • Created UNIX shell scripts and supported the implementation of automated ETL solutions, including job scheduling with Informatica pmcmd commands.
  • Consistently provided metrics (effort estimation and tracking on ETL tasks) and status reports for senior management.
  • Used Informatica PowerCenter to build a new agent data warehouse following the slowly changing dimension (SCD) Type II process.
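The common pmcmd wrapper described above can be sketched roughly as follows. All domain, service, user, folder, and workflow names are hypothetical placeholders, and the helper only assembles the command string (a real run would invoke pmcmd directly); the repository name from the parameter list is omitted because `pmcmd startworkflow` addresses the Integration Service, not the repository.

```shell
#!/bin/sh
# Sketch of a generic pmcmd wrapper: every connection detail arrives as a
# parameter so one script can start any workflow in any environment.
build_pmcmd_cmd() {
    domain="$1"; intsvc="$2"; user="$3"; folder="$4"; workflow="$5"
    # -pv reads the password from the PMPASS environment variable so no
    # credential appears on the command line; -wait blocks until the
    # workflow finishes so a scheduler like Autosys sees the real exit code.
    echo "pmcmd startworkflow -sv ${intsvc} -d ${domain} -u ${user}" \
         "-pv PMPASS -f ${folder} -wait ${workflow}"
}

build_pmcmd_cmd Dom_Dev IS_DEV etl_user SALES wf_daily_load
```

An Autosys job would then call one such wrapper with environment-specific arguments, keeping a single script per scheduler.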

TECHNICAL SKILLS:

ETL/Scheduling: Informatica PC 9.1/9.6.1/10.1/10.2, DQ 9.6.1/10.1/10.2

Databases: Oracle 11/12c, SQL Server 2008/2012, Hive

DB Tools: Toad, SQL Developer, SQL*Plus, SQL*Loader, Import, Export

OS: Windows 2008/2012, Red Hat Linux

PROFESSIONAL EXPERIENCE:

Informatica BDM Engineer/Administration

Confidential

Responsibilities:

  • Familiar with the install/upgrade process for BDM/EDC and IDL and their association with the different application services on the domain.
  • Familiar with the split-domain functionality for BDM and EDC, using the same Blaze engine on the cluster.
  • Familiar with the REST API calls to run scanners or update business terms in EDC.
  • Familiar with the configurations needed to route users to different queues on the cluster.
  • Processed healthcare HL7 files using the Informatica Data Transformation libraries and loaded them into Hadoop.
  • Automated the process with shell scripts to fetch the files from the remote server, read each one, and extract individual patient data, and created a Data Processor to make sure no data segment is missed.
  • Parsed hierarchical XML/JSON data from Data Transformation and loaded it into Hive.
  • Created Hive external tables on HDFS directories and their subdirectories to read the data from the Hue UI.
  • Since we were dealing with healthcare data, ran individual/enterprise data profiles on all databases for data discovery, as part of data governance, to put a process control in place for access within groups across the organization.
  • Used the default data domain groups (IDQ/BDQ product content), created customized rules to make sure all scenarios are covered, and followed up with data stewards to update the curation status, on which group access levels are decided.
  • Created different scanners (Hive/Oracle/PowerCenter/Platform) to load data assets into a single repository, EDC.
  • Scanners have the option to profile the data at column level, perform data discovery against the data domain groups, or both.
  • Loaded the business terms of the organization's data dictionary into EDC, assigned them to data assets, and created lineages for the data flow.
  • Used the IDL UI for analysis, merged with other assets, and published the results back to the lake.
  • Created customized BDM mappings/workflows for incremental loads using Informatica Developer and deployed them as part of an application on the Data Integration Service for native execution, or pushed down using the Blaze or Spark execution engines.
  • Assisted developers in debugging and migrating code between environments.
  • Worked with parameter sets/files at the mapping and workflow level.
  • Familiar with the Sqoop arguments required to source data from traditional database systems into Hadoop.
  • Experienced in writing native HiveQL and comparing load performance with BDM jobs.
  • Uploaded delimited files directly into the data lake from the UI.
  • Tune and redesign ETL Code for performance optimization as and when required.
  • Experienced in performing capacity and resource planning / management.
  • Experience in building and maintaining DEV, QA, PROD, and Disaster Recovery environments.
  • Experience in troubleshooting issues and resolving them quickly and efficiently to minimize downtime, following up with Informatica support as needed.
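The external-table step above can be sketched as a small helper that emits the Hive DDL for an HDFS landing directory. The table name, columns, and path are hypothetical placeholders; in practice the output would be piped to beeline, and reading nested subdirectories typically also requires the recursive-input settings on the cluster.

```shell
#!/bin/sh
# Emit DDL for an external Hive table over an HDFS landing directory, as
# used to expose parsed HL7 output through the Hue UI. Columns and paths
# are illustrative only.
make_external_ddl() {
    table="$1"; location="$2"
    cat <<EOF
CREATE EXTERNAL TABLE IF NOT EXISTS ${table} (
  patient_id STRING,
  segment    STRING,
  payload    STRING
)
ROW FORMAT DELIMITED FIELDS TERMINATED BY '|'
STORED AS TEXTFILE
LOCATION '${location}';
EOF
}

# A real run would look like:
#   make_external_ddl stg.hl7_segments /data/landing/hl7 | beeline -u "$JDBC_URL"
make_external_ddl stg.hl7_segments /data/landing/hl7
```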

Environment: Informatica - BDM 10.2.1, EDC 10.2.1, IDL 10.2.1

Informatica Developer/Administration

Confidential

Responsibilities:

  • Configured Informatica BDM 10.1 by integrating with the Kerberos-enabled HDP 2.5 and created mappings to read data from flat files/Oracle/SQL Server/Teradata and write it to HDFS/Hive using JDBC and Sqoop.
  • Familiar with configuring client PCs to import metadata from Kerberos-enabled HDP 2.5 using the local principal names and keytab files.
  • The primary BDM use case was to offload some of the Oracle licenses onto Hive and move some of the complex ETL processing onto the cluster; dynamic mappings were used to migrate data from Oracle to Hive, with parameters that let the same mapping execute for different tables through a shell script run from the command line.
  • Configured the subversioning feature (SVN), available with version 10 and above, for the Model Repository Service to enable code versioning.
  • Used the PC reuse utility to assess BDM compatibility for reusing some of the PowerCenter ETL code in Informatica Developer.
  • Converted PowerCenter code to BDM mappings using Informatica Developer.
  • Understood the transformations supported by BDM on the cluster before writing ETL low-level design documents for the developers.
  • Analyzed our source data systems using the IDQ/BDQ basic profiling features in the Informatica Developer and Analyst tools before creating ETL designs.
  • Familiar with creating IDQ/BDQ rules and use them within the mappings.
  • Ran profiles/mappings/workflows via applications deployed on the DIS, in native mode or with Hadoop pushdown using Blaze/Spark.
  • Used parameter sets/parameters against applications/workflows/mappings to ensure reuse across environments.
  • Created the JDBC connections with the Sqoop arguments needed to move the data from traditional databases onto Hadoop.
  • Writing shell scripts to schedule the Informatica domain and the repository backups on a weekly basis.
  • Worked on setting up the Data Replication server on a Windows 2008 system, feeding real-time data from SQL Server to an Oracle 11g system.
  • Responsible for maintaining incremental workflow runtimes to match business needs.
  • Worked on our data warehousing project, which involved integrating data from various source systems into the data warehouse and building custom interfaces for various reporting systems.
  • Tune and redesign systems for performance optimization as and when required.
  • Provide 24/7 production support after the implementation of the project.
  • Work with data loading and Reporting functions to support day to day activities.
  • An Informatica subject matter expert with extensive experience in the installation, configuration, backup, and recovery of Informatica modules; skilled in Oracle databases (10g R2 and 11g) and data warehousing concepts.
  • Review database performance metrics and proactively address performance concerns.
  • Experienced in performing capacity and resource planning / management.
  • Experience in building and maintaining DEV, QA, PROD, and Disaster Recovery environments.
  • Experience in troubleshooting issues and resolving them quickly and efficiently to minimize downtime.
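The JDBC/Sqoop connections described above can be illustrated with a minimal import command. Host, credentials file, and table names are placeholders, and the helper only prints the command line so nothing is executed in this sketch.

```shell
#!/bin/sh
# Assemble a Sqoop import that pulls an Oracle table into Hive. --split-by
# and --num-mappers drive the parallel extract; --password-file keeps the
# credential off the command line. All identifiers are hypothetical.
sqoop_import_cmd() {
    table="$1"; split_col="$2"
    echo "sqoop import" \
         "--connect jdbc:oracle:thin:@//dbhost:1521/ORCL" \
         "--username etl_user --password-file /user/etl/.pwd" \
         "--table ${table} --split-by ${split_col} --num-mappers 4" \
         "--hive-import --hive-table stg.${table}"
}

sqoop_import_cmd CUSTOMERS CUSTOMER_ID
```

In BDM the same arguments go into the JDBC connection's Sqoop properties rather than a hand-run command, which is how the dynamic mappings stayed parameterized per table.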

Environment: Informatica Power Center 9.5.1/9.6.1/10.2, BDM, Informatica Data Replication 9.5, Oracle 11g, Red Hat Linux 5.8, MS SQL Server 2008.

Informatica Developer/Admin

Confidential

Responsibilities:

  • Worked on migrating and upgrading the Informatica PC 9.5.1 application on Windows to 10.2 on Red Hat Linux, which involved bringing up a parallel environment, restoring the domain from the old server onto the new one, and migrating the repositories.
  • Then upgraded the 9.5.1 environment to 10.2, which involved upgrading 9.5.1 to 9.6.1 and then to 10.2, in order to make use of the DSN support for SQL Server databases on Windows as the PC repositories, starting with version 10.x.
  • Worked on Data Warehousing Projects which involves integrating data from various source systems into data Warehouse and build custom interfaces for analytical processing.
  • Experience in building and maintaining DEV, QA, PROD, and Disaster Recovery environments.
  • Experience in troubleshooting issues and resolving them quickly and efficiently to minimize downtime.
  • Migrate repositories between the environments and upgrade.
  • Schedule weekly repository backups via command line using shells.
  • Performed weekly maintenance to clear temp logs and restart services if needed after any system patches.
  • Monitored the server logs for database connectivity errors and followed up with DBAs on any performance issues.
  • Understand the versioning concept with Power Center.
  • Help developers in migrating the code between the environments.
  • Analyzed our source data systems using the IDQ basic profiling features in the Informatica Developer and Analyst tools before creating ETL designs.
  • Familiar with creating IDQ rules and using them within mappings.
  • Ran profiles/mappings/workflows via applications deployed on the DIS.
  • Used parameter sets/parameters against applications/workflows/mappings to ensure reuse across environments.
  • Create the Native/ODBC/JDBC connections on Power Center.
  • Writing shell scripts to schedule the Informatica domain and the repository backups on a weekly basis.
  • Responsible for maintaining incremental workflow runtimes to match business SLAs.
  • Tuned the ETL code when necessary to keep the load on the servers minimal.
  • Tune and redesign systems for performance optimization as and when required.
  • Provide 24/7 production support after the implementation of the project.
  • Work with data loading and Reporting functions to support day to day activities.
  • An Informatica subject matter expert with extensive experience in the installation, configuration, backup, and recovery of Informatica modules; skilled in Oracle databases (10g R2 and 11g) and data warehousing concepts.
  • Review database performance metrics and proactively address performance concerns.
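The weekly backup scripts mentioned above can be sketched as follows. Domain, repository, and path names are hypothetical, and the commands are echoed rather than executed so the sketch stays side-effect free; a real `infacmd isp BackupDomain` call also takes the domain database connection options, which are omitted here for brevity.

```shell
#!/bin/sh
# Print the weekly backup commands for cron/Autosys scheduling. pmrep backs
# up a PowerCenter repository; infacmd isp BackupDomain handles the domain.
# INFA_PASS names the environment variable holding the password (-X/-pd).
BACKUP_DIR=/infa/backups

backup_cmds() {
    repo="$1"; stamp="$2"
    echo "pmrep connect -r ${repo} -d Dom_Prod -n admin -X INFA_PASS"
    echo "pmrep backup -o ${BACKUP_DIR}/${repo}_${stamp}.rep"
    echo "infacmd isp BackupDomain -dn Dom_Prod -df ${BACKUP_DIR}/domain_${stamp}.mrep"
}

backup_cmds REP_PROD "$(date +%Y%m%d)"
```

Keeping the date stamp in the file name is what makes point-in-time restoration possible after a crash.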

Environment: Informatica Power Center 9.1, Oracle 11g, MS SQL 2012

Informatica Developer

Confidential

Responsibilities:

  • Involved in the full project life cycle, from analysis to production implementation and support, with emphasis on identifying sources and validating source data, developing the required logic and transformations, and creating mappings to load the data into BID.
  • Extracted data from flat files, Oracle, and DB2 and loaded it into flat files, Oracle, and DB2 using Informatica PowerCenter.
  • Based on the business requirements created Functional design documents and Technical design specification documents for ETL Process.
  • Reviewed data models using Erwin tool to find out data model dependencies.
  • Designing and developing ETL solutions in Informatica Power Center
  • Designing ETL process and creation of ETL design and system design documents.
  • Developing code to extract, transform, and load (ETL) data from inbound flat files and various databases into various outbound files using complex business logic.
  • Created automated shell scripts to transfer files among servers using the FTP and SFTP protocols and to download files from web servers and hosted files.
  • Developed Informatica mappings enabling the extract, transform, and load of large volumes of data into target tables.
  • Designed and developed processes to handle high volumes of data loading in each load window.
  • Effectively used all kinds of data sources to process the data, finally creating load-ready files (LRF) as outbound files that serve as inputs to other downstream applications.
  • Worked extensively on different types of transformations: Source Qualifier, Expression, Aggregator, Router, Filter, Update Strategy, connected and unconnected Lookup, Sorter, Normalizer, and Sequence Generator.
  • Created workflows, worklets, and tasks to schedule loads at the required frequency using the Informatica scheduling tool, and created control files to handle job dependencies.
  • Created and customized dimensional hierarchies, logical columns, aggregate columns, level-based measures, and aggregate navigation using Business Objects.
  • Assisted in performance testing, data quality assessment, support & product deployments.
  • Worked on parameterizing all variables and connections at all levels in UNIX.
  • Involved in job scheduling and production deployment activities: creation of the deployment guide, migration of code to production, monitoring, and production support in a 24/7 environment.
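The SFTP transfer automation above can be sketched as a batch-file generator. Host and directory names are hypothetical, and key-based authentication is assumed so no password handling appears in the script.

```shell
#!/bin/sh
# Build an sftp batch file that pulls the day's inbound flat files from a
# remote host into the local source-file directory. Paths and the remote
# account are illustrative placeholders.
build_sftp_batch() {
    remote_dir="$1"; local_dir="$2"
    cat <<EOF
cd ${remote_dir}
lcd ${local_dir}
mget *.dat
bye
EOF
}

# Real invocation (batch mode, -b -, reads the commands from stdin):
#   build_sftp_batch /outbound/daily /infa/srcfiles | sftp -b - etl@filesrv.example.com
build_sftp_batch /outbound/daily /infa/srcfiles
```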

Environment: Informatica Power Center 9.1/9.6, Oracle 11g, IBM DB2, IBM AIX 6.1
