Integration Architect Resume
San Diego, CA
SUMMARY
- Around 13 years of hands-on development experience in Data Warehouse & Business Intelligence platforms using Informatica PowerCenter, Informatica Cloud (IICS) Data & Application Integration, Informatica Data Quality, Informatica MDM, Informatica Analyst, B2B, Transformation Studio, Salesforce.com, FinancialForce, ServiceNow, Oracle (SQL & PL/SQL), SQL Server, Teradata, UNIX/Linux, Python, PowerShell, UNIX shell scripting, Tableau, Business Objects, Cognos, and Pentaho Data Integration (Kettle).
- Worked at software product, healthcare provider, health insurance, banking/financial services, and manufacturing companies.
- Developed IICS assets such as Data Synchronization tasks, Data Replication tasks, and mappings, as well as Processes, Service Connectors, and Business Services.
- Have written extensive Python programs for file handling.
- Expert in data warehouse concepts and methodologies such as the Ralph Kimball and Bill Inmon models.
- Have extensively worked on OLTP, OLAP, dimensional modeling (star schema, snowflake schema), operational data stores, conceptual database design, B2B exchange, and web services/APIs via REST/SOAP/OData/JSON.
- Certified in Informatica Mapping Designer 7.x, Informatica Data Quality 8.6, and Oracle SQL.
- Have used most of the transformations in Informatica PowerCenter and integrated with heterogeneous sources such as flat files, COBOL files, XML, RDBMS, and CSV files.
- Worked on IDQ for address cleansing using Address Doctor and for source data profiling to generate scorecards.
- Expertise in developing Master Data Management solutions using Informatica MDM. Proficient in configuring and creating Landing, Staging and Base tables in Informatica MDM Hub.
- Expertise working with the Informatica MDM components used in building and administering an MDM solution: Hub Console, Hub Store, Hub Server, and Cleanse Match Server.
- Experience working with Informatica Data Director (IDD) to implement data governance solutions for MDM applications.
- Good experience in Salesforce.com CRM, including creation and validation of objects and upsert/delete operations. Proficient with the Salesforce Data Loader utility for data scrubs such as insert and update, and have used Force.com Explorer to view backend data.
- Have worked with and have good knowledge of scheduling tools such as Autosys, Tivoli, and Tidal.
- Exposure to Netezza, BODI, ODI, OBIEE, and big data/data science technologies including Hive, Hadoop, MapReduce, and Spark.
TECHNICAL SKILLS
ETL: Informatica Cloud (IICS), Informatica PowerCenter 10.x/9.x/8.x/7.x, Pentaho Data Integration (Kettle), Informatica PowerExchange 9.x; exposure to ODI and BODI
Data Quality / MDM Tools: Informatica MDM Hub, IDE, Informatica Data Quality (IDQ) 9.x/8.x, Informatica Data Director (IDD), Informatica Developer
Cloud: Salesforce.com (SFDC), FinancialForce, ServiceNow, Workday, Cloud JIRA
Reporting: Tableau 9.2/10, Business Objects XI R2, Cognos 8.4, Jaspersoft iReport; exposure to OBIEE
Programming/Scripting: Python 3.x, PowerShell, UNIX shell, SQL
Databases: Oracle 12c/11g/10g/9i, Teradata V2R5, SQL Server 2008 R2, Netezza (trained)
Tools/Utilities: Salesforce Data Loader, SQL Developer, Toad, Tivoli, Autosys, Rapid SQL, Cloud JIRA, Jenkins, HP QC, HP ALM, Force.com Explorer, Erwin
SCM Tools: VSS, StarTeam, TortoiseSVN, TortoiseCVS
PROFESSIONAL EXPERIENCE
Confidential, San Diego, CA
Integration Architect
Technology/Tools: Informatica Cloud (IICS) - Cloud Data Integration & Cloud Application Integration, ServiceNow, Salesforce, FinancialForce, Workday, Python, PowerShell, Web Services/API via REST/SOAP/OData/JSON, Visual Studio Code, Cloud JIRA, Agile/Scrum
Responsibilities:
- Developing Informatica Cloud Data Integration and Cloud Application Integration jobs to integrate various applications.
- Creating XSD schemas and parsing XML files using Cloud Data Integration.
- Writing Python scripts to parse XML and extract its contents as JSON, and to manipulate XML nodes and values per business requirements (see the first sketch after this list).
- Developed Python programs that invoke the ServiceNow attachment APIs (binary and multipart) to upload payload files (see the second sketch after this list).
- Writing Python programs for file movement and archival (covered in the same sketch).
- Migrating and deploying code to higher environments (e.g. DEV to QA to UAT to PROD).
- Working in an Agile development and delivery environment.
- Working closely with the business users and product team to understand the data mapping requirements and transformations.
- Modified existing PowerShell scripts to accommodate requirement changes.
- Also working on the FinancialForce-to-ServiceNow integration for customer data and the Workday integration for HR/payroll/expense data.
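A minimal sketch of the XML parsing and manipulation described in the bullets above. The element names (Invoice, Amount, Status) and file paths are illustrative, not taken from the actual integration; only the Python standard library is used.

```python
import json
import xml.etree.ElementTree as ET

def xml_to_json(xml_path: str, json_path: str) -> None:
    """Parse an XML payload and write selected fields out as JSON."""
    root = ET.parse(xml_path).getroot()
    # One dict per record element; tag names are illustrative.
    records = [
        {
            "id": inv.get("id"),
            "amount": inv.findtext("Amount"),
            "status": inv.findtext("Status"),
        }
        for inv in root.findall(".//Invoice")
    ]
    with open(json_path, "w", encoding="utf-8") as fh:
        json.dump(records, fh, indent=2)

def update_status(xml_path: str, new_status: str) -> None:
    """Manipulate XML nodes in place, e.g. overwrite every Status value."""
    tree = ET.parse(xml_path)
    for node in tree.getroot().iter("Status"):
        node.text = new_status
    tree.write(xml_path, encoding="utf-8", xml_declaration=True)
```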
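A condensed sketch of the attachment upload and archival bullets. The instance URL, credentials, and table/sys_id values are placeholders; the two endpoints follow ServiceNow's documented Attachment API (binary POST to /api/now/attachment/file, multipart POST to /api/now/attachment/upload), though the multipart part name should be verified against the target instance.

```python
import shutil
from pathlib import Path

import requests

INSTANCE = "https://example.service-now.com"  # placeholder instance URL
AUTH = ("integration.user", "secret")         # placeholder credentials

def upload_binary(path: Path, table: str, sys_id: str) -> None:
    """Binary upload: POST the raw file bytes to /api/now/attachment/file."""
    resp = requests.post(
        f"{INSTANCE}/api/now/attachment/file",
        params={"table_name": table, "table_sys_id": sys_id,
                "file_name": path.name},
        headers={"Content-Type": "application/octet-stream",
                 "Accept": "application/json"},
        data=path.read_bytes(),
        auth=AUTH,
    )
    resp.raise_for_status()

def upload_multipart(path: Path, table: str, sys_id: str) -> None:
    """Multipart upload: POST form-data to /api/now/attachment/upload."""
    with path.open("rb") as fh:
        resp = requests.post(
            f"{INSTANCE}/api/now/attachment/upload",
            data={"table_name": table, "table_sys_id": sys_id},
            files={"uploadFile": (path.name, fh)},
            headers={"Accept": "application/json"},
            auth=AUTH,
        )
    resp.raise_for_status()

def archive(path: Path, archive_dir: Path) -> None:
    """Move a processed payload file into an archive folder."""
    archive_dir.mkdir(parents=True, exist_ok=True)
    shutil.move(str(path), str(archive_dir / path.name))
```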
Confidential
Lead Informatica Developer
Technology/Tools: Informatica Cloud (IICS), SFDC, Oracle 11g, SQL Server, Toad, Autosys, JIRA, Agile/Scrum
Responsibilities:
- Developing Informatica Cloud jobs to integrate an Oracle-based legacy system, ESB (Enterprise Solution for Bankers), with SFDC.
- Creating & Scheduling Informatica Cloud Data Replication and Data Synchronization Jobs
- Preparing code migration documents and working closely with DBAs on performance tuning and deployment.
- Migrating and deploying code to higher environments (e.g. DEV to QA to UAT to PROD).
- Working in an Agile development and delivery environment.
- Creating Custom Objects, Custom Fields, Reports and Dashboards in Salesforce
- Using Autosys to schedule all jobs and working closely with the production team.
Confidential, Grand Rapids, MI
Sr. Informatica Developer
Technology/Tools: Informatica Power Center 10.1, Informatica Analyst, Informatica MDM, Informatica Developer, Oracle 11g, XML, Flat files, RDBMS, Unix Shell Scripting, Tortoise CVS, JIRA
Responsibilities:
- Partnered with integration architects to document the detailed technical design and specifications for ETL transformations, mappings & MDM Base Objects.
- Developed Informatica mappings to build a data warehouse from sources such as Oracle, flat files, and XML files; generated flat file, Oracle, and XML targets as needed.
- Designed and developed MDM Base Objects and hierarchies using Informatica MDM Hub Console 10.1.
- Worked on Stage, Load and Match/Merge Jobs in MDM Hub Console. Implemented Match/Merge rules for creating golden records in Base Objects.
- Used MDM Hub Services Integration Framework (SIF) for integrating MDM Hub data with external systems.
- Worked on configuration for data stewards in Informatica Data Director (IDD).
- Developed reusable mapplets, such as one to generate PK/FK values for tables in any database.
- Worked on a Provider Hub MDM solution for merging provider IDs.
- Created data profiles in the Informatica Analyst tool.
- Worked on data quality enhancements using Informatica Developer (IDQ). Used Address Doctor for address cleansing of patient/member demographic information.
Confidential, San Ramon, CA
Sr. Informatica Developer
Technology/Tools: Informatica Power Center 9.x, Informatica Developer 9.6 (IDQ), Oracle 12c/11g, flat files, RDBMS, Unix Shell Scripting, SVN, HP QC
Responsibilities:
- Developed Informatica mappings to build a data warehouse (dimensional model) from sources such as Oracle, flat files, and XML files.
- Generated different kinds of targets such as Flat file and Oracle.
- Pilot project: integration with Salesforce.com using Informatica PowerCenter to pull account and contact data.
- Developed ETL routines using Informatica PowerCenter and created mappings involving Lookup, Aggregator, Rank, and Expression transformations; mapplets; connected and unconnected stored procedures; SQL overrides in Lookups; source filters in Source Qualifiers; and data flow management into multiple targets using Routers.
- Created IDQ mappings and profiles for source data (KDE) profiling, producing scorecards, frequencies, value distributions, and charts to surface null values, incorrect date formats, and similar issues; generated scorecards and trend charts.
- Developed a reusable mapplet to generate PK/FK for the tables in any database.
- Extensively used mapping variables and mapping parameters to implement complex business logic.
- Designed the dimensional model and data load process using SCD Type II (see the first sketch after this list).
- Designed and developed complex ETL mappings using connected/unconnected Lookup, Normalizer, and Stored Procedure transformations.
- Proficient in using the Source Analyzer, Warehouse Designer, Transformation Developer, Mapping Designer, and Mapplet Designer tools.
- Used the Debugger on critical mappings by setting breakpoints, and troubleshot issues by checking session and workflow logs.
- Identified bottlenecks in sources, targets, mappings, and sessions, and resolved them with performance tuning techniques such as increasing block size, data cache size, and sequence buffer length.
- Developed UNIX shell scripts to create parameter files, rename files, and compress files, and developed scripts to schedule periodic load processes (see the second sketch after this list).
- Responsible for documenting the processes carried out, including technical design documents, field mapping documents, and unit test cases and logs, using SharePoint for document version control and SVN for source code version control across all interfaces in the Interface Project.
- Worked closely on setting up the environment for file transfer activities between systems, using SFTP as the file transfer protocol.
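A minimal Python sketch of the SCD Type II versioning referenced above. The real logic was implemented in Informatica mappings; the column names (customer_id, city, eff_date, end_date, is_current) are illustrative.

```python
from datetime import date

# Each dimension row carries effective/expiry dates and a current flag;
# the dimension itself is modeled as a list of dicts for illustration.
HIGH_DATE = date(9999, 12, 31)

def apply_scd2(dimension: list[dict], incoming: dict, today: date) -> None:
    """Expire the current row if a tracked attribute changed, then insert
    a new current row (SCD Type II versioning)."""
    current = next(
        (r for r in dimension
         if r["customer_id"] == incoming["customer_id"] and r["is_current"]),
        None,
    )
    if current and current["city"] == incoming["city"]:
        return  # no change in tracked attributes: nothing to version
    if current:
        current["end_date"] = today      # close out the old version
        current["is_current"] = False
    dimension.append({
        **incoming,
        "eff_date": today,
        "end_date": HIGH_DATE,
        "is_current": True,
    })
```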
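The parameter files followed the standard PowerCenter layout: a bracketed [Folder.WF:Workflow] heading followed by $$name=value assignments. The original scripts were UNIX shell; this Python sketch shows the equivalent generation step, with the folder, workflow, and parameter names made up for illustration.

```python
from datetime import date

def write_param_file(path: str, folder: str, workflow: str,
                     params: dict[str, str]) -> None:
    """Emit a PowerCenter parameter file: a [Folder.WF:Workflow] heading
    followed by $$name=value assignments."""
    with open(path, "w", encoding="utf-8") as fh:
        fh.write(f"[{folder}.WF:{workflow}]\n")
        for name, value in params.items():
            fh.write(f"$${name}={value}\n")

# Illustrative daily-load parameters.
write_param_file(
    "wf_daily_load.parm",
    folder="SALES_DW",
    workflow="wf_daily_load",
    params={"LoadDate": date.today().isoformat(), "SourceSystem": "ORCL"},
)
```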