
Sr. Informatica Power Center/Cloud Developer Resume

Milwaukee

SUMMARY

  • Over 9 years of experience in the IT industry with a strong background in software development, including 8+ years of experience developing and testing Business Intelligence solutions in data warehousing and decision support systems using Informatica Cloud, Data Governance, Master Data Management, Informatica Power Exchange, Informatica Data Quality, and Informatica Power Center.
  • Experience in various domains like HealthCare, Finance, Telecom, Insurance, Agriculture & Forestry and Banking.
  • Strong in the data warehousing methodologies of Ralph Kimball and Bill Inmon, including Star and Snowflake schemas.
  • Architected and developed Informatica batch and real-time (CDC) processes to feed Informatica MDM (Master Data Management), serving as a single point of access to customer data across applications.
  • Took part in creating the Data Architecture Guidelines and Data Governance Framework in the Insurance and Financial domains.
  • Designed a real-time analytics and ingestion platform using Spring XD and Flume. Hands-on experience with multiple NoSQL databases, including MongoDB and HBase.
  • Extensive experience with Data Governance and the Informatica Data Quality (IDQ/IDE) 9.1/9.6 toolkit: analysis, data cleansing, data matching, data conversion, exception handling, and reporting and monitoring.
  • Expertise in integrations with Salesforce.com and backend interfaces (legacy systems such as Siebel and SAP) using Web Services and webMethods, with Informatica as the integration layer.
  • Worked on various Salesforce.com standard objects like Accounts, Contacts, Leads, Opportunities, Dashboards and Reports.
  • Worked with Informatica Power Exchange and Informatica Cloud to integrate with Salesforce and load data from Salesforce into an Oracle database.
  • Ability to write complex SOQL queries across multiple objects within the SFDC database. Created and deployed several workflows, and Reports using salesforce.com platform.
  • Experienced working with Informatica Big Data Analytics on Hadoop (Hortonworks).
  • Experienced in Implementing Big Data Technologies - Hadoop ecosystem/HDFS/ Map-Reduce Framework, HBase, Sqoop, Pig, Oozie and HIVE data warehousing tool.
  • Expert in understanding data and designing/implementing enterprise platforms such as Hadoop data lakes and large-scale data warehouses.
  • Developed/Created Datasets, Salesforce Wave Reports, Dashboards and Approvals to continuously monitor data quality and integrity. Expertise in Reporting, Customizing the Dashboard and Scheduling Dashboard Refreshing.
  • Created visually impactful dashboards in Excel and Salesforce Wave for data reporting by using pivot tables and VLOOKUP. Extracted, interpreted and analyzed data to identify key metrics and transform raw data into meaningful, actionable information.
  • Have good understanding of Tableau architecture, design, development and end user experience.
  • Extensive experience in working with Tableau Desktop, Tableau Server and Experience in using Tableau functionalities for creating different Requests, Filters, Charts, Interactive dashboards with Page and dashboard Prompts.
  • Experienced in the use of agile approaches, including sprint planning, daily stand-up meetings, reviews, retrospectives, release planning, demos, Extreme Programming, Test-Driven Development and Scrum.
  • Worked as an ONCALL production specialist with primary and secondary duties.
  • Analyzed and resolved the incidents raised by the application users on priority (low, medium and high) through Enterprise support tickets.
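The Kimball-style Star schema mentioned above organizes data as a central fact table joined to denormalized dimension tables. A minimal, self-contained sketch (hypothetical sales tables, Python/sqlite3, purely illustrative):

```python
import sqlite3

# Minimal star-schema sketch: one fact table joined to two dimension
# tables. All table and column names here are hypothetical illustrations.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE dim_date (date_key INTEGER PRIMARY KEY, year INTEGER, month INTEGER);
    CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, category TEXT);
    CREATE TABLE fact_sales (date_key INTEGER, product_key INTEGER, amount REAL);
    INSERT INTO dim_date VALUES (1, 2023, 1), (2, 2023, 2);
    INSERT INTO dim_product VALUES (10, 'Hardware'), (20, 'Software');
    INSERT INTO fact_sales VALUES (1, 10, 100.0), (1, 20, 50.0), (2, 10, 75.0);
""")

# A typical star-schema rollup: aggregate the fact table by dimension attributes.
rows = con.execute("""
    SELECT d.year, p.category, SUM(f.amount)
    FROM fact_sales f
    JOIN dim_date d ON d.date_key = f.date_key
    JOIN dim_product p ON p.product_key = f.product_key
    GROUP BY d.year, p.category
    ORDER BY p.category
""").fetchall()
print(rows)  # [(2023, 'Hardware', 175.0), (2023, 'Software', 50.0)]
```

A Snowflake schema differs only in that the dimensions themselves are further normalized into sub-dimension tables.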

TECHNICAL SKILLS

ETL: Informatica Power Center 10.1/9.6/9.1/8.x/7.x, Informatica Cloud (IICS), Informatica IDQ, Informatica Analyst, Power Exchange CDC, Informatica BDE

BI Tools: Salesforce Wave Analytics, Business Objects, Tableau 9.x

Big Data Ecosystems: Hadoop, Map Reduce, HDFS, Hive, Pig, Sqoop, Oozie, Spring XD

Operating Systems: Windows 95/98/2000/2003 Server, Windows NT Server/Workstation 4.0, UNIX

Programming: Java, R, PIG, Hive, C, SQL, PL/SQL, HTML, XML, DHTML

Other Tools: Eclipse, SQL*Plus, TOAD 8.0, MS Visio, Aginity, CA Workstation, ESP

Scripting Languages: SQL, PL/SQL, UNIX Shell Scripting

Methodologies: Agile, E-R Modeling, Star Schema, Snowflake Schema

Data Modeling Tool: Erwin 3.5/4.1

Databases: Oracle 11g/9i/8i/7.3, MS SQL Server 2008, DB2, Netezza, Sybase, Teradata

PROFESSIONAL EXPERIENCE

Confidential

Sr. Informatica Power Center/Cloud Developer

Responsibilities:

  • In-depth practical knowledge of all modules and features of Salesforce, both Sales and Service Cloud. Good exposure to other areas of project execution, including customer-facing work, requirement gathering and analysis, consulting, solution design, documentation, implementation, development, and support of business solutions.
  • Involved in extracting, transforming, and loading data for the Opportunities, Accounts, Users, Leads, Contacts, Record Type, Opportunity History, Tasks, and Events interaction tables from various source systems into Salesforce.com, and a reverse data feed from Salesforce for the Honeywell CRM.
  • Worked with Informatica Cloud/Power Center to create source/target connections, and to monitor and synchronize the data in SFDC.
  • Automated Validation and De-duplication of Salesforce data using Informatica Cloud Customer 360 CC360.
  • Implemented effortless Consolidation and Integration of hierarchical data from multiple systems to provide a Single View of Customer using Informatica CC360.
  • Worked in Data Cleansing and mapping data from source salesforce.com to target Oracle.
  • Data Migration (from salesforce.com to Oracle) using Informatica Cloud and Power Center.
  • Wrote SOQL queries to fetch data via Workbench and Explorer. Designed and developed ETL and Data Quality mappings to load and transform data from sources such as Oracle and SQL into the data warehouse using Power Center and Cloud.
  • Used Teradata SQL Assistant for running the SQL queries that verify the data loaded to the target tables.
  • Performed data profiling and analysis of various objects in SalesForce.com (SFDC) using IDQ and MS Access database tables for an in-depth understanding of the source entities, attributes, relationships, domains, source data quality, hidden and potential data issues, etc.
  • Implemented Change Data Capture (CDC) on Source data from Salesforce.com.
  • Extracted various data from Salesforce.com using Informatica 10.1 with the Salesforce adapter.
  • Created the dashboards in salesforce wave analytics and Used multi-dimensional analysis (Slice & Dice and Drill Functions) to organize the data along a combination of Dimensions and Drill-down Hierarchies giving the ability to the end-users to view the data from heterogeneous viewpoints.
  • Customized the Salesforce Wave Dashboards to the track usage for productivity and performance of business centers and their sales teams.
  • Scheduled the Informatica jobs using Autosys.
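The incremental (CDC-style) Salesforce extracts described above typically filter on the standard SystemModstamp audit field so that only records changed since the last successful run are pulled. The helper below is a hypothetical sketch of that watermark pattern, not Informatica's or Salesforce's actual tooling:

```python
from datetime import datetime, timezone

def incremental_soql(obj, fields, last_run):
    """Build a SOQL statement that pulls only records changed since the
    last successful run -- the watermark idea behind a CDC-style extract.
    Opportunity and SystemModstamp are standard Salesforce names; the
    helper itself is an illustrative sketch."""
    stamp = last_run.strftime("%Y-%m-%dT%H:%M:%SZ")
    return (f"SELECT {', '.join(fields)} FROM {obj} "
            f"WHERE SystemModstamp > {stamp}")

q = incremental_soql("Opportunity", ["Id", "Name", "Amount"],
                     datetime(2024, 1, 1, tzinfo=timezone.utc))
print(q)
# SELECT Id, Name, Amount FROM Opportunity WHERE SystemModstamp > 2024-01-01T00:00:00Z
```

After each run, the watermark is advanced to the run's start time so the next extract picks up where this one left off.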

Environment: Informatica Power Center 10.1, Informatica Cloud/ICS, MDM, Informatica REV, Autosys, Data Modeling (Erwin), Teradata, UNIX, Siebel, Oracle 12c, Salesforce.com, Salesforce Wave Analytics, Flat files, XML, Shell Scripting, Putty, WinSCP and Toad.

Confidential, Milwaukee

Sr. Informatica Power Exchange/IDQ Developer

Responsibilities:

  • Involved in designing and developing the architecture for all data warehousing components, e.g., tool integration strategy; source system ETL strategy; data staging, movement, and aggregation; information and analytics delivery; and data quality strategy.
  • Designed and developed ETL and Data Quality mappings to load and transform data from sources such as DB2, Oracle and Sybase to Data warehouse using Power Center and IDQ/IDE.
  • Extensively used Informatica Data Quality (IDQ) transformations like Match, Consolidation, Exception, Parser, Standardizer and Address Validator.
  • Developed IDQ Match and Merge strategy and Match and consolidation based on customer requirement and the data.
  • Built several reusable components in IDQ/IDE using Parsers, Standardizers, and Reference tables.
  • Performed data profiling and analysis of various objects in SalesForce.com (SFDC) using IDQ and MS Access database tables for an in-depth understanding of the source entities, attributes, relationships, domains, source data quality, hidden and potential data issues, etc.
  • Worked with the Business Analysts on IDQ - Data Profiling, Data Validation, Standardization and Data Cleansing for the Oracle 12c data migration to rebuild and enhance the business rules.
  • Also, worked with Business Analysts to modify/enhance rules for physical/mailing addresses using IDQ -Address Validator.
  • Developed both one-time and real-time mappings using Power Center 9.6 and Power Exchange.
  • Registered the data maps for real-time CDC (Change Data Capture) in Power Exchange. Worked on Extraction Maps Row Test in the Power Exchange Navigator.
  • Implemented Change Data Capture (CDC) on Source data from Salesforce.com.
  • Extracted various data from Salesforce.com using Informatica 9.6 with the Salesforce adapter.
  • Worked on updating Sales Force using external ID and created various objects in Salesforce.
  • Used Teradata SQL Assistant for running the SQL queries that verify the data loaded to the target tables.
  • Worked on design and development of Informatica mappings, workflows to load data into staging area, data warehouse and data marts in Teradata.
  • Created new and modified existing Hierarchies in the universes to meet Drill Analysis of the user’s reporting needs and Involved in performance tuning and optimization on Business Objects Universes by creating Aggregate tables.
  • Performed integrity testing of the Universes (Universe Structure checking, Object parsing, joins parsing, Conditions parsing, Cardinalities checking, Loops checking and Contexts checking) after any modifications to them in terms of structure, classes, and objects.
  • Used multi-dimensional analysis (Slice & Dice and Drill Functions) to organize the data along a combination of Dimensions and Drill-down Hierarchies giving the ability to the end-users to view the data from heterogeneous viewpoints.
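The slice-and-dice and drill-down operations described above amount to re-aggregating the same facts at different levels of a dimension hierarchy. A small sketch with hypothetical data (a Region → City hierarchy):

```python
from collections import defaultdict

# Drill-down sketch: the same fact rows aggregated at two levels of a
# Region -> City hierarchy. All values here are hypothetical.
facts = [
    {"region": "Midwest", "city": "Milwaukee", "sales": 120},
    {"region": "Midwest", "city": "Chicago",   "sales": 200},
    {"region": "South",   "city": "Tampa",     "sales": 90},
]

def rollup(rows, *levels):
    """Aggregate sales along the requested dimension levels."""
    totals = defaultdict(int)
    for r in rows:
        totals[tuple(r[lvl] for lvl in levels)] += r["sales"]
    return dict(totals)

print(rollup(facts, "region"))          # {('Midwest',): 320, ('South',): 90}
print(rollup(facts, "region", "city"))  # drill down one level of the hierarchy
```

Drilling down adds a level to the grouping key; slicing fixes one dimension member and aggregates the rest.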

Environment: Informatica Power Center 9.6, Informatica Power Exchange 9.6, Informatica IDQ/IDE 9.6, Autosys, Hadoop Ecosystem, Data Modeling (Erwin), UNIX, Windows 7 Professional Client, Sybase, Teradata, Oracle 10g, DB2, SAP Business Objects, Flat files, XML and COBOL, Shell Scripting, Putty, WinSCP and Toad.

Confidential, Illinois

Sr. Informatica Developer/IDQ Developer

Responsibilities:

  • Successfully designed and architected the Integrated Data Warehouse in John Deere on Big Data platform.
  • Designed, developed, implemented, and maintained an Informatica Data Quality (IDQ)/MDM application for the matching and merging process.
  • Utilized Informatica IDQ/IDE 9.1 to complete initial data profiling and matching/removing duplicate data.
  • Installed and configured content-based data dictionaries for the data cleansing, parsing, and standardization process to improve the completeness, conformity, and consistency issues identified in the profiling phase using IDQ/IDE.
  • Configured the Analyst tool (IDE) and helped data stewards and business owners profile the source data, create scorecards, apply built-in DQ rules, and validate the results.
  • Experienced working with Informatica Big Data - To Read/Write HDFS files, Hive Tables and Hbase.
  • Installed and configured Hadoop MapReduce and HDFS; developed multiple MapReduce jobs in Java for data cleaning and preprocessing.
  • Imported and exported data into HDFS and Hive using Sqoop.
  • Experienced in running Hadoop streaming jobs to process terabytes of XML-format data.
  • Mastered the ability to design and deploy rich Graphic visualizations using Tableau.
  • Generated various dashboards in Tableau Server using different data sources such as Netezza and DB2, and created report schedules, data connections, projects, and groups.
  • Expert level capability in table calculations and applying complex, compound calculations to large, complex big data sets.
  • Worked closely with business power users to create reports/dashboards using tableau desktop.
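The data-cleaning MapReduce jobs described above follow a map step that normalizes raw records and a reduce step that de-duplicates them by key. A plain-Python sketch of that shape (the actual jobs were written in Java; the records here are hypothetical):

```python
# Map/reduce-style cleaning sketch: "map" normalizes whitespace and case
# and emits (key, record); "reduce" keeps one record per key, as a
# reducer would. Input lines are hypothetical city,state pairs.
raw = [" milwaukee ,WI", "Milwaukee,wi", "CHICAGO , il"]

def map_clean(line):
    """Normalize a raw line and emit (key, cleaned_record)."""
    city, state = (part.strip() for part in line.split(","))
    record = (city.title(), state.upper())
    return record, record  # key is the cleaned record itself

def reduce_dedupe(pairs):
    """Collapse duplicates: keep one record per key."""
    return sorted({key: rec for key, rec in pairs}.values())

cleaned = reduce_dedupe(map_clean(line) for line in raw)
print(cleaned)  # [('Chicago', 'IL'), ('Milwaukee', 'WI')]
```

In the real Hadoop jobs, the shuffle phase performs the grouping by key between the two steps; here a dict plays that role.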

Environment: Informatica Power Center 9.6, Hadoop Ecosystem, Informatica DVO, IDQ, MDM, Power Exchange, Data Modeling (Erwin), Netezza, PL/SQL, DB2, Sybase, Tableau V8, Shell Scripting, Putty, WinSCP, Toad, and Aginity tool.

Confidential

Sr. Informatica Developer/IDQ Developer

Responsibilities:

  • Served as an ETL Developer/Data Quality Analyst in the deployment of FPR (Financial Product Reporting) and PDS (Persistent Data Staging). Primary responsibility was performing the data quality checks and data integration using Informatica Data Quality and Power Center, UNIX shell scripting, and architecting and developing a custom ETL framework that consisted of over 144 processes using Oracle's native language (PL/SQL), loading the results into an Oracle DWH.
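Rule-based data-quality checks like those described above can be sketched as a set of per-column validation rules applied to each row. The rules and column names below are hypothetical illustrations (the project itself used IDQ and PL/SQL):

```python
import re

# Hypothetical per-column data-quality rules: each maps a column name to
# a predicate that returns truthy when the value passes the check.
RULES = {
    "customer_id": lambda v: v is not None and str(v).isdigit(),
    "email":       lambda v: v is not None
                             and re.fullmatch(r"[^@\s]+@[^@\s]+\.[^@\s]+", v),
}

def failed_checks(row):
    """Return the names of every rule the row violates."""
    return [col for col, rule in RULES.items() if not rule(row.get(col))]

print(failed_checks({"customer_id": "123", "email": "a@b.com"}))  # []
print(failed_checks({"customer_id": "x1", "email": "bad"}))       # ['customer_id', 'email']
```

In a production framework, rows with failures would be routed to an exception table for stewardship review rather than loaded into the warehouse.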

Environment: Informatica Power Center 9.1, Informatica IDQ/IDE, Power Exchange, Web Services, IDQ, UNIX, Windows 2000 Professional Client, Oracle 8i/9i Enterprise Edition, PL/SQL, Teradata, SAP BO XI R2/6.5, VSAM files, Flat files, XML and COBOL, Shell Scripting, Putty, WinSCP and Toad.

Confidential

Sr. Informatica Developer

Responsibilities:

  • Served as a Senior ETL Developer/SQL Developer in the enhancement of an existing custom ETL framework that collected, cleansed, and integrated the company’s performance data (i.e., cash flows, liquidity positions) from various operational source systems. The enhancements were part of an overall initiative to improve an internal custom-built Liquidity Risk Management System (LRSMS) that provided JP Morgan Corporate Treasury executives, senior managers, and business analysts with analytical reporting capability on the company’s liquidity position, sources and uses of cash, forecasting of cash flows and stress-test modeling, funding counterparties, and development of funding plans as needed. Major contributions and accomplishments included: designing and developing an ETL component that dynamically constructed, in real time, the SQL to load over 100 source feeds into the Liquidity Position fact table using a metadata strategy; designing and developing an ETL process that mapped custom products and/or services in a hierarchical relationship to the company’s general ledger products for reporting purposes; developing Sybase objects such as tables, views, indexes, triggers, procedures, and functions to support the ETL metadata rule component; and providing support, guidance, and training in the deployment of the solution in environments such as integration, QA, and production.

Environment: Informatica Power center 9.1, Webservices, SQL Server 2008, Oracle 11i/10g, Teradata, PL/SQL, Power exchange 9.1, Sybase, SAP Business Objects XI 3.x, TOAD, Windows XP, UNIX maestro, ERWIN 4.2, Control-M.

Confidential

Sr. Informatica Developer

Responsibilities:

  • Served as an ETL Developer/Data Analyst in the deployment of Claims, Hospital Events, In-Patient Pharmacy, Hospital Billing, Professional Billing, and related subject areas to the data warehouse, to make patients' claims data easier to access; to provide analytics, trending and comparisons, and reporting and exporting capabilities that support the business needs; and, above all, to increase data utilization and improve customer service and productivity. Primary responsibility was using Teradata utilities (SQL, BTEQ, FastLoad, MultiLoad, FastExport, TPump, Visual Explain, Queryman), Teradata parallel support, and UNIX shell scripting, and architecting and developing a custom ETL framework that consisted of over 144 processes using Oracle's native language (PL/SQL), loading the results into Teradata.

Environment: Informatica Power Center 8.6, Power Exchange 8.6, HIPAA (835, 837 Institutional/Professional, Inbound/Outbound), Webservices, Business Objects 6.X, Oracle 11g/10g, PL/SQL, Flat files, XML, COBOL, Teradata.

Confidential, FL

Informatica Developer

Responsibilities:

  • Served as one of the ETL Informatica Developers, gathering requirements from end users and involved in the analysis of source systems, business requirements, and identification of business rules. Designed and implemented Informatica mappings to migrate data from various legacy applications/acquisition offices to a centralized application. Responsible for the analysis, design, and implementation of various data marts for the back office (Financial, Payroll, Benefits, and HR modules) using data modeling techniques and Informatica 7.6. Tuned mappings and sessions for better performance on the data loads.

Environment: Informatica Power Center (Designer 7.6, Repository Manager 7.6, Workflow Manager 7.6), Power Exchange, Business Objects XI/6.X, Oracle 11g/10g, PL/SQL, SQL*Plus, SQL Server 2008/2005, Flat files, XML, TOAD, UNIX, Erwin 4.0 and Shell Scripting.
