
IICS/IDQ Lead Developer Resume

Atlanta, GA

PROFESSIONAL SUMMARY:

  • 11+ years of experience in design, development, and integration with Informatica Data Quality, Informatica Intelligent Cloud Services (IICS), BDM, and cloud services in the Insurance, Banking, Retail, and Healthcare industries.
  • 8+ years of development/lead/administration experience with Informatica PowerCenter, Data Quality, and BDM, and 3+ years with Informatica Intelligent Cloud Services (IICS) on AWS and Azure cloud services and cloud relational databases.
  • Hands-on with IICS components (Data Integration, Application Integration, Monitor, and Administrator); data profiling, analysis, cleansing, address validation, fuzzy matching/merging, data conversion, and exception handling in Informatica Data Quality 10.1 (IDQ); and Extraction, Transformation, and Loading (ETL) of data from various sources into data warehouses using Informatica PowerCenter.
  • Expertise in processing semi-structured data (CSV, Parquet, XML, and JSON) from S3 buckets/Blob storage and building mappings/mapplets using Cloud Data Integration services.
  • Experience with real-time integration processes that interact with REST APIs, web services, and SOAP calls and interface with various applications in the cloud.
  • Experienced with the Informatica ecosystem; implemented batch-mode and real-time loads from multiple source systems such as web services and loaded data into S3/RDS.
  • Migrated a SQL Server database into a multi-cluster Snowflake environment, set up data sharing for multiple applications, and created Snowflake virtual warehouses sized by data volume/jobs.
  • Migrated legacy Informatica logic into Informatica Intelligent Cloud Services (IICS) using Snowflake and cloud services.
  • Created data governance business rules and integrated a front-end application with IDQ real-time web services (WSDL).
  • Used Informatica Data Quality (IDQ) profiling capabilities on various sources, generated scorecards, created and validated rules, and provided data to business analysts for creating new business rules.
  • Used various Informatica PowerCenter and Data Quality transformations such as Source Qualifier, Aggregator, Update Strategy, Expression, Joiner, Lookup, Router, Sorter, Filter, Web Services Consumer, XML Parser, Labeler, Parser, Address Validator, Match, Comparison, Consolidation, Standardizer, and Merge to perform various data loading and cleansing activities.
  • Good understanding of MDM architecture: mappings, trust and validation rules, match paths, match columns, match rules, merge properties, and batch group creation.
  • Expertise as an Informatica Administrator in setting up new Informatica environments and upgrading existing environments to new versions.
  • Set up and configured IICS, IDQ, PowerCenter, and BDM grids and services on 10.x/9.x/8.x; hands-on applying hotfixes and EBFs across Informatica products; provided day-to-day production support.
  • Used various AWS services including EC2, Redshift, Databricks, S3 buckets, Kinesis, and IaaS/PaaS/SaaS offerings, and Azure services such as Virtual Machines, Blob Storage, Data Lake, Data Factory, Azure SQL, and PostgreSQL.
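As an illustration of the semi-structured data processing noted above, the sketch below flattens JSON-lines records (as they might arrive from an S3 bucket or Blob storage) into CSV-ready rows in plain Python. The record layout and field names are hypothetical, and a real mapping would run inside IICS rather than in a script.

```python
import csv
import io
import json

def flatten(record, parent=""):
    """Recursively flatten a nested JSON record into dotted column names."""
    flat = {}
    for key, value in record.items():
        name = f"{parent}.{key}" if parent else key
        if isinstance(value, dict):
            flat.update(flatten(value, name))
        else:
            flat[name] = value
    return flat

def json_lines_to_csv(lines):
    """Convert JSON-lines input (e.g. downloaded from S3/Blob) to CSV text."""
    rows = [flatten(json.loads(line)) for line in lines]
    out = io.StringIO()
    writer = csv.DictWriter(out, fieldnames=sorted(rows[0]))
    writer.writeheader()
    writer.writerows(rows)
    return out.getvalue()

# Hypothetical sample record with one nested object
sample = ['{"id": 1, "name": {"first": "Ann", "last": "Lee"}}']
print(json_lines_to_csv(sample))
```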

TECHNICAL BACKGROUND:

ETL: Informatica 10.x/9.x/8.x, IICS, PowerCenter, Data Quality, BDM, MDM.

Database: SQL Server 2016/ 2012, SQL Azure, Snowflake, Redshift, RDS, Oracle 11g/10g

OLAP Tools: OBIEE11g/ OBIEE10g, Business Objects, Tableau.

Cloud Services: AWS - EC2, S3, EMR, Kinesis, AMI, CloudWatch, Docker, Redshift, VPC, iPaaS.

Azure: Azure VM, Blob, Data Lake, Data Factory, HDInsight, AD, Docker.

Operating Systems/Tools: Red Hat 7/6, Amazon Linux, Ubuntu, Windows Server 2016/2012, Jupyter Notebook

PROFESSIONAL EXPERIENCE:

Confidential, Atlanta, GA

IICS/ IDQ Lead Developer

Responsibilities:

  • Designed, developed, and deployed end-to-end data integration solutions.
  • Extracted flat files from S3 bucket storage to a Snowflake database using IICS Data Integration services.
  • Set up the Secure Agent on instances and loaded files from S3 into the warehouse using IICS Data Integration services; also extracted staging data from heterogeneous databases to the cloud warehouse.
  • Created data services/API web services for third-party applications using Application Integration.
  • Migrated existing on-prem legacy Informatica code into IICS cloud services.
  • Built real-time integration tasks using Informatica Cloud with REST APIs, web services, and SOAP. Extensively used Informatica Cloud Data Synchronization, Data Replication, and mapping configuration tasks while integrating cloud-based applications.
  • Loaded and transformed large sets of structured and semi-structured data with IICS, analyzing and extracting into Redshift/S3/Hive/HDFS.
  • Hands-on with Snowflake databases, schemas, and table structures; used advanced features such as Clone, Time Travel, and temporary and transient tables on different datasets.
  • Migrated a SQL Server database into a multi-cluster Snowflake environment, shared data with multiple applications, and created Snowflake virtual warehouses sized by data volume/jobs.
  • Hands-on with PySpark, Amazon EC2, Amazon S3, Redshift, Amazon EMR, Amazon RDS, and other services in the AWS family.
  • Worked on data profiling across various sources, data standardization, address validation, and matching and merging.
  • Used fuzzy matching strategies (identity, Hamming distance, bigram) for name, organization name, address, and policy matching; compared two addresses to generate a match score through a WSDL web service and in batch mode.
  • Created data governance business rules and WSDL web services in real-time and batch modes; corrected data that failed governance rules so it could be re-validated when resubmitted through the integrated front-end application.
  • Optimized Informatica application WSDL web service performance from 900 milliseconds to 34 milliseconds.
  • Extensively used Informatica functions (LTRIM, RTRIM, DECODE, ISNULL, IIF, INSTR, and date functions) in transformations.
  • Highly proficient in T-SQL for developing complex stored procedures, triggers, functions, views, indexes, cursors, SQL joins, and dynamic SQL queries.
  • Good understanding of MDM architecture: mappings, trust and validation rules, match paths, match columns, match rules, merge properties, and batch group creation.
  • Prepared data mapping and data migration documents for smooth transfer of the project from the development environment to test and then to production.
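The bigram and Hamming-distance match strategies mentioned in the bullets above can be sketched in plain Python as below. This is a minimal illustration of the scoring idea (bigram score as a Dice coefficient), not the actual IDQ Match transformation configuration; the sample names are hypothetical.

```python
def bigrams(text):
    """Split a normalized string into overlapping two-character grams."""
    s = text.lower().strip()
    return [s[i:i + 2] for i in range(len(s) - 1)]

def bigram_score(a, b):
    """Dice-coefficient match score between two strings, 0.0 to 1.0."""
    ga, gb = bigrams(a), bigrams(b)
    if not ga or not gb:
        return 0.0
    overlap = sum(min(ga.count(g), gb.count(g)) for g in set(ga))
    return 2.0 * overlap / (len(ga) + len(gb))

def hamming_distance(a, b):
    """Count of differing positions; defined for equal-length strings."""
    if len(a) != len(b):
        raise ValueError("hamming distance requires equal lengths")
    return sum(c1 != c2 for c1, c2 in zip(a, b))

# Hypothetical name pair: close variants score high but below 1.0
print(round(bigram_score("Jon Smith", "John Smith"), 2))
```

In practice a threshold (say 0.8) would decide match vs. no-match, with borderline scores routed to an exception workflow for manual review.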

Environment: Informatica Intelligent Cloud Services (IICS), Informatica Data Quality 10.x, Snowflake, Redshift, RDS, Amazon Elastic MapReduce, Spark, Hive, Python, SQL Server 2016.

Confidential, Redmond, DC

Sr IDQ/ ETL Developer/ Administrator

Responsibilities:

  • Responsible for requirement definition and analysis in support of Data Warehousing efforts.
  • Expertise in Informatica Data Quality transformations such as Address Validator, Parser, Labeler, Match, Exception, Association, Standardizer, and other significant transformations.
  • Experience in configuring data profiling, data standardization, address validation, and matching and merging.
  • Designed several Processes on Informatica Data Quality and exposed them as RESTful API services to publish data to external systems.
  • Experience in development of mappings in IDQ to load cleansed data into the target table using various IDQ transformations. Experience in data profiling and analyzing scorecards to design the data model.
  • Experience with Informatica Advanced Techniques - Dynamic Caching, Memory Management, Parallel Processing to increase Performance throughput.
  • Developed Informatica Workflows and sessions associated with the mappings using workflow manager; Developed mapplets, reusable mapplets, mappings and source/ target definitions.
  • Developed and maintained ETL (Extract, Transformation and Loading) mappings to extract the data from multiple source systems like CSV, XML, SQL server and Flat files and loaded into SQL Server/ Azure SQL.
  • Created Pipelines in Python using Datasets/Pipeline to Extract, Transform and load data from different sources like Azure SQL, Blob storage, Data Lake, Azure SQL Data warehouse.
  • Responsible for defining mapping parameters and variables and session parameters according to the requirements and performance related issues.
  • Built Power BI reports using output source files from Blob Storage.
  • Responsible for design, implementation of Informatica 10.x/9.x platform and continue to support existing Informatica 8.x platform.
  • Upgraded Informatica from 9.x to 10.x, set up Informatica PowerCenter disaster recovery, installed Informatica hotfixes and EBFs (emergency bug fixes) on servers, and applied Windows/Linux security patches on a monthly basis.
  • Configured Informatica Data Quality (IDQ) components such as the Model Repository Service, Data Integration Service (DIS), Content Management Service, web services, and Business Glossary.
  • Configured Active Directory LDAP on the Admin console to authenticate and authorize developers/business users.
  • Debugged and troubleshot web service logs/mapping logs to find performance issues in web service responses.
  • Hands-on experience deploying Informatica Data Quality (IDQ)/PowerCenter components such as mappings, rules, applications, and workflows with their dependencies, using the advanced import/export method and command-line utilities.
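The Python pipeline work described above (extract from Blob storage, transform, load to Azure SQL) can be sketched as a minimal extract-transform-load in plain Python. The source text, column names, and in-memory "warehouse" target are all stand-ins; a real pipeline would read from Azure Blob/SQL connections instead.

```python
import csv
import io

def extract(csv_text):
    """Extract: parse raw CSV text (e.g. downloaded from Blob storage)."""
    return list(csv.DictReader(io.StringIO(csv_text)))

def transform(rows):
    """Transform: trim names and cast amounts, dropping unparseable rows."""
    out = []
    for row in rows:
        try:
            out.append({"name": row["name"].strip().title(),
                        "amount": float(row["amount"])})
        except (KeyError, ValueError):
            continue  # a real pipeline would route bad rows to an exception table
    return out

def load(rows, target):
    """Load: append clean rows to the target (a stand-in for Azure SQL)."""
    target.extend(rows)
    return len(rows)

warehouse = []
raw = "name,amount\n ann lee ,10.5\nbob,notanumber\n"
load(transform(extract(raw)), warehouse)
```

The bad row is silently skipped here; in production it would land in an exception table for review rather than being dropped.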

Environment: Informatica10.x/9.x, SQL Server 2016, Azure SQL, Python, RDS, Blob Storage, Data Lake, Data Factory.

Confidential, FL

Sr ETL Developer

Responsibilities:

  • Responsible for development, support, and maintenance of ETL (Extract, Transform, and Load) processes using the Informatica Integration Suite.
  • Extensive experience in developing complex mappings in Informatica to load data from various sources using different transformations such as Source Qualifier, Lookup, Expression, and Update Strategy.
  • Worked with Informatica Data Quality 10.0 (IDQ): analysis, data cleansing, fuzzy data matching, data conversion, and exception handling.
  • Designed and developed transformation rules (business rules) to generate consolidated (fact/summary) data using Informatica ETL tool.
  • Deployed reusable transformation objects such as mapplets to avoid duplication of metadata, reducing the development time.
  • Extracted, cleansed, aggregated, transformed, and validated data to ensure accuracy and consistency.
  • Experience with Informatica Advanced Techniques - Dynamic Caching, Memory Management, Parallel Processing to increase Performance throughput.
  • Extensively involved in Optimization and Tuning of mappings and sessions in Informatica by identifying and eliminating bottlenecks, memory management and parallel threading.
  • Developed Informatica workflows and sessions associated with the mappings using workflow manager; Developed mapplets, reusable mapplets, mappings and source/ target definitions.
  • Preparing all the DB scripts and Informatica objects for the implementation in production environment.
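The consolidated (fact/summary) transformation rules described above amount to rolling detail rows up by key, which can be sketched in plain Python as below. The region/product/qty/price columns are hypothetical; in the actual project this roll-up was done with Informatica Aggregator logic rather than a script.

```python
from collections import defaultdict

def summarize(transactions):
    """Roll detail rows up into one consolidated fact per (region, product)."""
    totals = defaultdict(lambda: {"qty": 0, "revenue": 0.0})
    for t in transactions:
        key = (t["region"], t["product"])
        totals[key]["qty"] += t["qty"]
        totals[key]["revenue"] += t["qty"] * t["price"]
    return dict(totals)

# Hypothetical detail rows feeding the summary fact table
detail = [
    {"region": "SE", "product": "A", "qty": 2, "price": 5.0},
    {"region": "SE", "product": "A", "qty": 1, "price": 5.0},
    {"region": "NE", "product": "B", "qty": 4, "price": 2.5},
]
print(summarize(detail))
```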

Environment: Informatica 9.6/9.1, PowerCenter, Data Quality, Amazon Cloud, Oracle, SQL Server 2016/2014, Windows Server 2016/2014.

Confidential, WI

Informatica Developer/ Administrator

Responsibilities:

  • Developed ETL programs using Informatica to implement the business requirements.
  • Hands-on in all phases of the SDLC, from requirement gathering, design, development, and testing through production and user support for the production environment.
  • Modified Informatica mappings, transformations, sessions, and workflows in PowerCenter Designer/Workflow Manager when changes were requested by clients.
  • Responsible for creating workflows and sessions using Informatica Workflow Manager and monitoring workflow runs and statistics in Informatica Workflow Monitor.
  • Responsible for defining mapping parameters and variables and session parameters according to requirements and performance-related issues.
  • Created various tasks such as Event Wait, Event Raise, and Email.
  • Created shell scripts for automating worklets, batch processes, and session scheduling using pmcmd.
  • Responsible for design, implementation of Informatica 8.x/9.x platform and continue to support existing Informatica 8.x platform.
  • Upgraded Informatica from 8.x to 9.x and setup Informatica PowerCenter Disaster Recovery and installed Informatica Hotfixes and EBF (emergency bug fix) on servers and updated windows/ Linux security patches on monthly basis.
  • Configured Active Directory LDAP on Admin console to authenticate authorize Developers/ Business users.
  • Configured Informatica Data Quality (IDQ) components like Model Repository Service, DIS Data Integration Service, Content Management services, web services and Business Glossary.
  • Performed System level health checks CPU, Memory utilization, Number of parallel loads (sessions) running on each node and provided recommendations on capacity planning (Disk Space, Memory & CPU etc.)
  • Created deployment groups/scripts for migrating code from lower to higher environments.
  • Extensively worked on automation scripts (auto-restart of services, disk space utilization, cleanup of log directories) and scripts using Informatica command-line utilities.
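The pmcmd automation above can be sketched as a small Python wrapper that assembles a `pmcmd startworkflow` argument list. The service, domain, folder, and workflow names are placeholders, and the actual subprocess call is commented out because pmcmd is only available on an Informatica node.

```python
import subprocess  # used only on a real Informatica node (see comment below)

def pmcmd_startworkflow(service, domain, user, pwd_env, folder, workflow):
    """Build the pmcmd startworkflow argument list for a given workflow."""
    return ["pmcmd", "startworkflow",
            "-sv", service, "-d", domain,
            "-u", user, "-pv", pwd_env,      # password read from an env var
            "-f", folder, "-wait", workflow]

# Placeholder names: service, domain, folder, and workflow are illustrative
cmd = pmcmd_startworkflow("IS_DEV", "Domain_Dev", "etl_user",
                          "PMPASS", "FINANCE", "wf_daily_load")
# On an Informatica node the automation script would then run:
# subprocess.run(cmd, check=True)
print(" ".join(cmd))
```

Passing the password via an environment variable (`-pv`) rather than inline keeps credentials out of process listings and scheduler logs.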

Environment: Informatica 9.1/8.6, PowerCenter, Power Exchange, Metadata Manager, Oracle, SQL Server 2014/2012, DB2, RedHat 5.6.

Confidential

Informatica/ OBIEE Developer

Responsibilities:

  • Extracted the data from Flat files and loaded the data into Data warehouse using Informatica PowerCenter.
  • Coordinating with source system owners, day-to-day ETL progress monitoring, Data warehouse target schema Design (Star Schema) and maintenance.
  • Worked on Informatica PowerCenter - Source Analyzer, Data warehousing designer, Mapping Designer & Mapplets, and Transformation Developer, Informatica Repository Manager and Informatica Workflow Manager.
  • Used various transformations to implement simple and complex business logic, including connected and unconnected Lookups, Router, Expression, Source Qualifier, Aggregator, Filter, and Sequence Generator.
  • Created and Configured Workflows, Worklets and Sessions to transport the data to target tables in warehouse using Informatica Workflow Manager.
  • Created different types of Customized Reports Drilldown, Aggregation to meet client requirements.
  • Created business reports and dashboard using BI Answers as per requirements.
  • Generated various Analytics Reports using global and local filters.
  • Implemented data-level, object-level, and role-based security; created and managed session, repository, and presentation variables.
  • Designed Schema/Diagrams using Fact, Dimensions, Physical, Logical, Alias and Extension tables.
  • Built the business model and established relationships and foreign keys (physical and logical) between tables.

Environment: Informatica 8.6, OBIEE 10.1.3.4, DAC10g, Oracle10g, SQL server 2008.
