
Sr Informatica Developer/data Analyst Resume

South San Francisco, CA

PROFESSIONAL SUMMARY

  • Over ten years of experience in the IT industry, with expertise in analysis, design, development, implementation, testing, and support of Data Warehouse and MDM applications across the Telecom, Sales, Retail, and Banking domains.
  • Extensive experience in Informatica Power Center 9.x/8.x/7.x (Designer, Repository Manager, Workflow Manager, Workflow Monitor), the Informatica Data Quality tool (IDQ), and the Informatica Analyst tool.
  • Used various complex PowerCenter transformations such as Lookup, Joiner, Expression, Router, Update Strategy, Source Qualifier, Aggregator, Filter, Sequence Generator, and Normalizer to accomplish mapping designs.
  • Excellent knowledge of IDQ transformations such as Parser, Labeler, Match, Consolidation, Address Validator, and Bad Record Exception.
  • Strong skills in Informatica MDM (Master Data Management) for golden-record creation and the data-mastering process.
  • Well versed with Relational and Dimensional modeling techniques like Star, Snowflake Schema, OLAP, Dimensional and Fact Tables.
  • Experience in Performance Tuning of Targets, Sources, Mappings and Sessions in Powercenter.
  • Extensive database experience using Oracle 11g/10g/9i, Teradata, MS SQL Server 2012/2008/2005, and Greenplum.
  • Working knowledge of HiveQL and R programming.
  • Developed complex database objects such as stored procedures, analytic functions, and triggers.
  • Experience using Teradata as the target for data marts; worked with Teradata utilities such as BTEQ, FastLoad, and MultiLoad.
  • Exposure in reporting tools like Salesforce Wave Analytics, OBIEE, MicroStrategy and Tableau.
  • Sound knowledge on major technologies in Hadoop Ecosystem such as Hive, Pig, HBase and Sqoop.
  • Highly effective in maintaining team dynamics and leading teams to successful project delivery.
  • Proven ability to work efficiently in both independent and collaborative environments, with excellent interpersonal and communication skills.
  • Motivated team player, proficient in process-centric environments such as Agile/Scrum.
  • Provided Informatica training at Accenture Greenfield Training (GFT).

TECHNICAL SUMMARY

Data Warehousing Tools: Informatica MDM 10.0.1, Informatica Power Center 9.x/8.x/7.x, Informatica IDQ 9.6.1, Informatica Power Exchange 9.6.5/8.6, Oracle Data Integrator (ODI), SSIS

Reporting Tools: Salesforce Wave Analytics, MicroStrategy, Business Objects R4, SSRS, Tableau

Modeling Tools: Rational Software Architect, Erwin, Visio

Databases: Oracle 11g/10g/9i, MS SQL Server 2012/2008/2005, Teradata, MS Access

Programming: Java 1.5, SQL, PL/SQL, UNIX Shell Scripting, Python and R

Big Data Technologies: Hive, Pig

Scheduling Tools: Autosys, DAC, Control M, Tidal

Versioning Tools: SharePoint, Rational Team Concert, Perforce, Team Foundation Server, GitHub

OS: Windows 2000, Windows XP, Windows 7, Windows 8, UNIX, Linux

Other Tools: Toad, SQL Developer, SoapUI, Google Geocoding API

Application server: JBoss

IDE: Eclipse

PROFESSIONAL EXPERIENCE

Confidential, South San Francisco, CA

Sr Informatica Developer/Data Analyst

Responsibilities:

  • Gathered and documented requirements around the business problem (diversion and duplicate discounts).
  • Analyzed data from different data-provider sites such as the Census Bureau, Health Resources & Services Administration, CMS (Medicare providers), and hospital service area data.
  • Created ETL designs and developed mappings and workflows; performed unit testing and peer reviews.
  • Integrated charge back data, indirect sales data, population, cancer data and Hospital service area from different sources to identify the diversion.
  • Used R Programming and Data mining tools to identify the patterns in diversion.
  • Used Python scripting to automate Medicare provider reference data.
  • Created supporting data sheets (Tableau reports) to present the findings.
  • Automated the entire manual file download process from external source system with R programming.
  • Performed data profiling and data quality checks in IDQ.
  • Created a reusable mapplet to calculate the turnaround time (TAT) for access solutions (case management).
  • Provided work estimation to clients and managed task allocation to offshore team.
  • Worked directly with the business to validate the findings of diversion and potential threats.
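As an illustration of the file-download automation described above, the sketch below shows a minimal Python approach; the URL and field names are placeholders, not the actual CMS endpoint used on the project:

```python
import csv
import io
import urllib.request

# Hypothetical base URL -- the real reference-data endpoint is not given here.
BASE_URL = "https://example.gov/medicare-providers/{year}/providers.csv"


def build_download_url(year: int) -> str:
    """Build the download URL for a given reference-data year."""
    return BASE_URL.format(year=year)


def parse_provider_csv(text: str) -> list:
    """Parse provider reference data into a list of row dicts."""
    return list(csv.DictReader(io.StringIO(text)))


def refresh_reference_data(year: int) -> list:
    """Download and parse one year's provider file (requires network access)."""
    with urllib.request.urlopen(build_download_url(year)) as resp:
        return parse_provider_csv(resp.read().decode("utf-8"))
```

Splitting URL construction and parsing from the network call keeps the pipeline testable without hitting the external site.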

Environment: Informatica PowerCenter 9.6.1, Flat Files, Oracle 11g, UNIX, Redwood, Informatica IDQ 9.6.1, Informatica Analyst, Informatica MDM 10.1, Address Doctor, JIRA, Tableau, RStudio, Python, Orange

Confidential, San Francisco, CA

Data Analyst

Responsibilities:

  • Created data model for data reconciliation framework from different source system.
  • Created ETL designs and developed mappings and workflows; performed unit testing and peer reviews.
  • Created base objects, staging tables, and relationships in MDM.
  • Used Address Doctor and the Google Geocoding API to cleanse party addresses.
  • Developed MDM queries and packages.
  • Developed MDM mappings and cleanse functions.
  • Worked closely with the ETL data architect to populate landing tables in MDM.
  • Conducted analysis sessions with business users to understand matching requirement.
  • Set up match/merge and run match rules to check the effectiveness of MDM process on data.
  • Provided production support on a rotation basis and fixed production failures within the SLA.
  • Escalated production issues and remediation steps effectively and in a timely manner.
  • Performed troubleshooting analysis and resolution for critical applications and batch processes.
  • Coordinated with vendors to troubleshoot and resolve issues.
  • Resolved ongoing maintenance issues and bug fixes; monitored Informatica sessions and performance-tuned mappings and sessions.
  • Created Tableau dashboards to provide an insight on data reconciliations issues from different source systems.
  • Used SIF APIs (GET, SearchMatch, PUT, CleansePut, ExecuteBatchDelete etc.) to test search, update, cleanse, insert and delete of data from SoapUI.
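The SIF API testing mentioned above was driven from SoapUI; a rough Python sketch of how such a SOAP request body could be assembled is shown below. The SIF namespace and element names here are placeholders, assumed for illustration only; the real ones come from the WSDL generated by the MDM Hub:

```python
import xml.etree.ElementTree as ET

SOAP_NS = "http://schemas.xmlsoap.org/soap/envelope/"
# Placeholder namespace -- check the MDM Hub's generated WSDL for the real one.
SIF_NS = "urn:example-sif"


def build_searchmatch_request(ors_id: str, record: dict) -> bytes:
    """Assemble a SearchMatch-style SOAP envelope as raw XML bytes."""
    envelope = ET.Element(f"{{{SOAP_NS}}}Envelope")
    body = ET.SubElement(envelope, f"{{{SOAP_NS}}}Body")
    req = ET.SubElement(body, f"{{{SIF_NS}}}SearchMatch")
    ET.SubElement(req, f"{{{SIF_NS}}}orsId").text = ors_id
    rec = ET.SubElement(req, f"{{{SIF_NS}}}record")
    for field, value in record.items():
        # Field element names mirror the base-object column names.
        ET.SubElement(rec, f"{{{SIF_NS}}}{field}").text = value
    return ET.tostring(envelope, encoding="utf-8")
```

Building the envelope programmatically makes it easy to parameterize SoapUI-style test cases across many input records.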

Environment: Informatica MDM 10.1, IDD, Address Doctor, Oracle 11g, Control M, Informatica 9.6.1, Informatica IDQ 9.6.1, Informatica Analyst, Google API, Tableau

Confidential

Data Warehousing

Responsibilities:

  • Worked closely with the clients to understand the requirements.
  • Created the data model and ETL design according to requirements and developed Informatica jobs.
  • Developed data visualization reports/dashboards using Tableau.
  • Performed data profiling and data quality checks in IDQ.
  • Created scorecards in IDQ and helped business analysts define survivorship rules in MDM.
  • Provided work estimation to clients and managed task allocation to offshore team.
  • Created Reference tables and developed a framework to publish these data to Business analysts to update using Human Tasks in IDQ.
  • Developed IDQ mapplet using Address validator and invoked the same through power center.
  • Used Bad Record Exception transformation in Informatica Data Quality tool (IDQ) to publish the bad data for Business Analyst’s review in Informatica Analyst tool.
  • Developed mappings and workflows; performed unit testing and peer reviews.
  • Handled code migration to higher environments.
  • Migrated large volumes of data from SQL Server to Teradata for the marketing department.
  • Implemented a file watcher in shell script and created separate shell scripts for post-processing files.
  • Provided production support on a rotation basis and fixed production failures within the SLA.
  • Created audit mechanism for different vendor files.

Environment: Informatica PowerCenter 9.6.1, Flat Files, SQL Server 2012, Oracle 11g, Linux, Control M, Informatica IDQ 9.6.1, Informatica Analyst, Informatica MDM 10.1, IDD, Address Doctor, Google API, Tableau, Teradata

Confidential, San Francisco, CA

Sr Informatica Developer

Responsibilities:

  • Worked closely with Business Analysts to close the BRD (Business Requirement Document).
  • Created technical design documents and mapping documents for ETL.
  • Created Informatica mappings, sessions, workflows, and Wave upload jobs.
  • Uploaded external data files via API calls, specifying the metadata file structure in JSON format.
  • Redesigned existing mappings to improve performance.
  • Created UNIX scripts and modified existing scripts for reusability.
  • Validated JSON and XMD files for datasets created by the ETL process.
  • Scheduled ETL jobs/shell scripts in Tidal framework.
  • Conducted pre UAT sessions for Business Users.
  • Analyzed issues found in UAT and fixed bugs.
  • Escalated issues and remediation steps effectively and in a timely manner.
  • Performed troubleshooting analysis and provided resolution within the SLA.
  • Provided production support, resolving ongoing maintenance issues and bug fixes; monitored Informatica sessions and performance-tuned mappings and sessions.
  • Implemented row level security predicates for Sales KOA data sets.
  • Performed data validation in Salesforce Wave Analytic App (Analytics Cloud).
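The JSON metadata file used for Wave external-data uploads can be generated programmatically. The sketch below shows a simplified version of that metadata schema (field names here are illustrative; the authoritative shape is in the Salesforce External Data Format documentation), base64-encoded as the upload object expects:

```python
import base64
import json


def build_wave_metadata(dataset: str, fields: list) -> str:
    """Build a simplified Wave external-data metadata JSON for a dataset
    and base64-encode it for the InsightsExternalData upload object.

    `fields` is a list of (name, type) tuples, e.g. ("Amount", "Numeric").
    """
    meta = {
        "fileFormat": {"charsetName": "UTF-8", "fieldsDelimitedBy": ","},
        "objects": [{
            "name": dataset,
            "fullyQualifiedName": dataset,
            "fields": [
                {"name": n, "fullyQualifiedName": n, "type": t}
                for n, t in fields
            ],
        }],
    }
    return base64.b64encode(json.dumps(meta).encode("utf-8")).decode("ascii")
```

Generating the metadata from the ETL target definition keeps the dataset schema and the uploaded file in sync.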

Environment: Informatica 9.5.1, Oracle 11g, Confidential, Power Exchange for Salesforce, Tidal, Tidal Transporter, UNIX, Salesforce Wave Analytics Cloud, Flat files, Perforce, JSON editor.

Confidential

Sr Informatica Developer

Responsibilities:

  • Analyzed old ETL jobs in Business Objects Data Integrator and created the design approach.
  • Analyzed the impact on existing data warehouse to make it in-sync with legacy system.
  • Created design document and source to target mapping document for legacy migration.
  • Designed and developed complex Informatica mappings, including Type-II slowly changing dimensions.
  • Provided design walkthrough to Architects and finalized the design.
  • Extracted data from Salesforce cloud and migrated to new data warehouse.
  • Developed mappings, sessions, and workflows, and checked in deployment packages to Perforce.
  • Created Tidal jobs in the Development and Test environments.
  • Performance-tuned mappings.
  • Performed peer reviews and tested deliverables of peers.
  • Created release notes for each deployment.
  • Supported system integration and provided bug fixes in production.

Environment: Informatica 9.5.1, Business Objects Data Integrator, Oracle 11g, Tidal, Tidal Transporter, UNIX, XML files, Flat files, Perforce, Power Exchange for Salesforce.

Confidential

Sr Informatica Developer

Responsibilities:

  • Worked closely with Business Analysts to ensure data quality of receiving files.
  • Worked with the Informatica Data Quality toolkit, using the analysis, data cleansing, data matching, and data conversion capabilities of IDQ.
  • Identified and eliminated duplicates in datasets through IDQ components of Edit Distance, Jaro Distance and Mixed Field matcher.
  • Used Address doctor to validate and correct address.
  • Designed the ETL process according to file arrival patterns and created the technical specification document (TSD).
  • Performed extensive extraction of data from flat files into Greenplum and wrote results back to flat files.
  • Development of mappings, workflows and Unit Testing.
  • Designed and Developed pre-session, post-session routines and batch execution routines.
  • Debugged and troubleshot mappings.
  • Created dashboards and reports in MicroStrategy.
  • Provided Informatica code walkthroughs to the QA team and fixed defects.
  • Coordinated offshore development activities.

Environment: Informatica 9.5.1, Flat Files, Greenplum, UNIX, PostgreSQL, AutoSys, Perforce, Informatica IDQ, MicroStrategy.

Confidential

Solution Designer/ Development Lead

Responsibilities:

  • Worked with the Business Users and Business Analysts for gathering functional requirements.
  • Created Data model and Solution design documents and Technical design documents.
  • Designed Universe and Reports for Reporting (Business Objects) environment.
  • Provided support to offshore development team in understanding the Design.
  • Extensively used PL/SQL programming procedures, packages to implement business rules.
  • Created and Configured Workflows, Worklets and Sessions to transport the data to target warehouse tables using Workflow Manager.
  • Optimized the performance of the mappings by various tests on sources, targets and transformations. Developed Procedures and Functions in PL/SQL for ETL.
  • Extensively used ETL to load data from source systems like Flat Files into staging tables and load the data into the target database Oracle.
  • Created and used reusable Mapplets and transformations.
  • Improved performance by identifying the bottlenecks in Source, Target, Mapping and Session levels.
  • Designed and Developed pre-session, post-session routines and batch execution routines.
  • Involved in debugging and Trouble Shooting of Mappings.

Environment: Informatica Power Center 9.1.1, Oracle 10g, UNIX and PL/SQL Developer, HP-IUX, BSCS, Siebel CRM, EAI, IN-Eserve, Business Objects, RSA

Confidential

ETL Developer

Responsibilities:

  • Analyzed the customization impact on Siebel CRM and propagated the changes to prebuilt mappings.
  • Created high-level designs, mappings, sessions, and workflows.
  • Worked with ODI Topology Manager, contexts, and physical schemas.
  • Deployed workflows to higher environments using the Informatica Import/Export utility.
  • Performed mapping reviews using an Informatica code-review tool.
  • Developed Pre and Post SQL scripts, PL/SQL stored procedures and functions.
  • Used Debugger to check the errors in mapping.
  • Generated UNIX shell scripts for automating daily load processes.
  • Creation of test cases and execution of test scripts as part of Unit Testing.
  • Involved in quality assurance of data, automation of processes.
  • Documented the entire process; the documents included the mapping document, unit testing document, and system testing document, among others.
  • Effectively handling change requests by developing, implementing and testing of solutions.
  • Review of mappings and workflows developed by Peers and report the defects.
  • Created different types of customized reports (drilldown, aggregation) for Sales and Product to meet client requirements.
  • Involved in RPD development, Oracle BI Administration, Oracle BI Answers, and BI Publisher.
  • Managed Change control implementation and coordinating daily, monthly releases.

Environment: Informatica Power Center 7.1.4, OBIA, Oracle 10g, DAC, ODI, UNIX and PL/SQL, OBIEE

Confidential

ETL Developer

Responsibilities:

  • Involved in Business Analysis and requirement gathering.
  • Designed the mappings between sources to operational staging targets.
  • Used Informatica Power Center as an ETL tool for building the data warehouse.
  • Employed Aggregator, Sequence Generator, and Joiner transformations in the population of the data.
  • Used Teradata as both a source and a target for a few mappings.
  • Worked with Teradata loaders within Workflow Manager to configure FastLoad and MultiLoad sessions.
  • Transformed data from SQL Server databases and loaded it into an Oracle database.
  • Developed PL/SQL and UNIX Shell Scripts for scheduling the sessions in Informatica.
  • Creation of test cases and execution of test scripts as part of Unit Testing.
  • Provided Knowledge Transfer to the end users and created extensive documentation on the design, development, implementation, daily loads and process flow of the mappings.
  • Creation of Transformations like Lookup, Joiner, Rank and Source Qualifier Transformations in the Informatica Designer.
  • Created complex mappings using Unconnected Lookup, Sorter, Aggregator, dynamic Lookup, and Router transformations to populate target tables efficiently.
  • Created Mapplet and used them in different Mappings.
  • Worked extensively on Power Center client tools like Source Analyzer, Warehouse designer, Transformation developer and Mapping Designer, Workflow Designer.
  • Created Unix Script for Audit trail.
  • Worked in production support and fixed failures within the SLA.

Environment: Informatica Power Center 8.6.1, Oracle 10g, UNIX, Oracle Apps
