Sr. Informatica IDQ/MDM Developer Resume
- Software professional with 9 years of experience in Information Technology, with extensive experience in Master Data Management and as an ETL Developer, with prime focus on analysis, design, development, customization, and maintenance of various data warehouse applications to deliver leading-edge software solutions.
- Hands-on experience in all aspects of the Software Development Life Cycle (SDLC) and Agile/Scrum methodologies.
- Hands-on experience with the Informatica MDM Hub Console, including data mappings from landing to staging to base objects, trust and validation rules, match paths, match columns, match rules, merge properties, and batch group creation; also involved in customizing and configuring IDD applications.
- Expertise in design and configuration of landing tables, staging tables, base objects, hierarchies, foreign-key relationships, lookups, query groups, queries/custom queries and packages
- Excellent experience in designing and developing multi-layer web-based information systems using Web Services, including Java and JSP.
- Experience in Data Warehouse, Relational Database and System Integration. Proficiency in gathering and analyzing user requirements and translating them into business solutions.
- Efficient in creating source and Confidential databases and developing strategies for extraction, transformation, and loading (ETL) using Informatica PowerCenter Designer, Repository Manager, Workflow Manager, and Workflow Monitor.
- Worked with complex mappings using different transformations such as Source Qualifier, Expression, Filter, Joiner, Router, Update Strategy, Union, Rank, Normalizer, Unconnected/Connected Lookup, Java, Sorter, Sequence Generator, and Aggregator.
- Practical knowledge of Git, AWS, Docker, Chef, Jenkins, Linux, Python, Java, Teradata, MySQL, Oracle, and scripting.
- Experience using MongoDB, an open-source database that avoids the traditional table-based relational structure in favor of JSON-like documents with dynamic schemas (a format MongoDB calls BSON).
- Designed and implemented Slowly Changing Dimensions (SCD types 1, 2, and 3) and Change Data Capture (CDC).
- Hands on experience in creating ETL transformations and jobs using Pentaho Kettle Spoon designer and scheduling them on Pentaho BI Server.
- Has worked with technologies including XML, Java, Hermes JMS, SOAP UI, ASP, VB, Oracle, SQL Server, and IBM mainframes.
- Strong understanding of Dimensional Modeling, OLAP, Star, Snowflake Schema, Fact, and Dimensional tables and DW concepts.
- Experienced with the SSIS programming model, coding several SSIS packages in multiple programming languages.
- Extensive experience in integration of various data sources such as Oracle 12c/11g/10g, Teradata 14/13, Netezza, UDB DB2, mainframes, SQL Server 2012, SAP, Sybase, Informix, MySQL, flat files, MQ Series, and XML.
- Serial/parallel batch processing and real-time ETL including CDC and queues (MQ Series, JMS, TIBCO, etc.).
- Wrote and executed test cases and tracked defects using Quality Center.
- Involved in migration of objects in all phases (DEV, QA, and PRD) of the project and trained developers to maintain the system in production.
- Good knowledge of the AutoSys scheduling tool, Redgate tools, TeamCity (build management and continuous integration server), TortoiseSVN (Subversion client), and StarTeam (revision control system).
- Excellent analytical skills and a strong ability to grasp new technologies. Effective team player with excellent communication skills; a self-starter and result-oriented, with the ability to manage multiple tasks simultaneously.
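The SCD Type 2 handling called out above can be sketched in plain Python. This is an illustrative stand-in for the actual Informatica mappings, not a reproduction of them; the row layout (`id`, `attr`, `eff_from`, `eff_to`) and the high end-date are assumptions for the example.

```python
from datetime import date

# Sentinel end-date marking the "current" version of a dimension row.
HIGH_DATE = date(9999, 12, 31)

def apply_scd2(dimension, incoming, today):
    """Apply one incoming record to an SCD Type 2 dimension (list of dicts).

    New members get a first version; changed members have the current
    version end-dated and a new version opened. Returns the dimension.
    """
    current = next((r for r in dimension
                    if r["id"] == incoming["id"] and r["eff_to"] == HIGH_DATE),
                   None)
    if current is None:
        # Brand-new member: insert the first version.
        dimension.append({"id": incoming["id"], "attr": incoming["attr"],
                          "eff_from": today, "eff_to": HIGH_DATE})
    elif current["attr"] != incoming["attr"]:
        # Attribute changed: expire the old version, open a new one.
        current["eff_to"] = today
        dimension.append({"id": incoming["id"], "attr": incoming["attr"],
                          "eff_from": today, "eff_to": HIGH_DATE})
    return dimension
```

In a real mapping this compare-expire-insert pattern is what the Update Strategy and Lookup transformations implement against the dimension table.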
Databases: Oracle 9i/10g/11g/12c, SQL Server 2014/2012/2008/2005
Languages: C, C++, Java, J2EE, Visual Basic, SQL, PL/SQL, Siebel, COBOL, Python, and UNIX Shell Scripting.
ETL Tool: Informatica Power Center 10.x/9.x/8.x (Designer, Workflow Manager, Workflow Monitor, Repository manager and Informatica Server)
MDM Packages: Informatica MDM Multi Domain Edition 10.0, 9.7.1, 9.5, 9.1; Informatica Data Director (IDD) 10.0, 9.7, 9.5; Informatica Data Quality (IDQ) 10.0, 9.6; DDM 9.6; SAP
RDBMS: Oracle 12c/11g/10g/ 9i (SQL/PLSQL)
DB Tools: SQL*Plus, SQL*Loader, TOAD, OBIEE, BTEQ, FastLoad, MultiLoad, FastExport, SQL Assistant, Teradata Administrator, PMON, Teradata Manager, MicroStrategy, Cognos, BO
Tools and Utilities: TOAD 10.1, TextPad, WordPad, SQL Developer 4.0.3
Modeling Tools: Erwin 4.0 data modeler, ER studio 7.5, MS Visio 2007
Environment: Windows 7/XP/2000, Windows Server 2003/2008, UNIX, Linux
Packages: MS Office (MS Access, MS Excel, MS PowerPoint, MS Word), Visual Studio, Java Eclipse
Scheduling Tools: Autosys
Version Control Tools: Clear Case
Data Methodologies: Logical/Physical/Dimensional, Star/Snowflake, ETL, OLAP, Complete Software Development Cycle, Erwin 4.0
BI: Microstrategy, SAS, Cognos.
Operating Systems: Sun Solaris 2.6/2.7/2.8/8.0, Linux, Windows, UNIX
Confidential, Charlotte, NC
Sr. Informatica IDQ/MDM Developer
- Used Address Doctor extensively for North America Address validations. Built several reusable components on IDQ using Standardizers and Reference tables which can be applied directly to standardize and enrich Address information.
- Experience in extracting addresses from multiple heterogeneous sources such as flat files, Oracle, SAS, and SQL Server.
- Created custom rules to validate zip codes and states and to segregate address data by country.
- Created web services for address mapplets of different countries to integrate with SOAP UI.
- Used the Informatica MDM 10.1 (Siperian) tool to manage master data for the EDW.
- Extracted consolidated golden records from MDM base objects and loaded into downstream applications.
- Extensively involved in ETL testing: created unit and integration test plans to test the mappings, created test data, and used debugging tools to resolve problems.
- Created reference tables to standardize data.
- Experience in validating data quality and business rules using the mapping document and FSD to maintain data integrity.
- Used Python scripts to update content in the database and manipulate files.
- Worked with a team of developers on Python applications for risk management.
- Experience in writing SQL test cases for Data quality validation.
- Worked extensively on MicroStrategy administration: creation of new users, roles, privileges, shared folders, and access control lists.
- Experience in various data validation and data analysis activities to perform data quality testing.
- Experience investigating and communicating data quality issues and data failures to the onsite DQ development team and fixing them.
- Experience in end-to-end data quality testing and support in an enterprise warehouse environment.
- Experience in maintaining data quality, data consistency, and data accuracy for data quality projects.
- Provided production support to schedule and execute production batch jobs, and analyzed log files on Informatica 8.6/9.1 Integration servers.
- Experience in Data profiling and Scorecard preparation by using Informatica Analyst.
- Strong knowledge of Informatica IDQ 9.6.1 transformations and the PowerCenter tool.
- Strong exposure to source-to-Confidential data flows and data models for various data quality projects.
- Participated in daily status calls with onsite project managers and DQ developers to report test status and defects.
- Strong knowledge of databases, data warehouse concepts, ETL processes, and Business Intelligence.
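A minimal sketch of the kind of custom zip-code rule described above, written in Python rather than as an IDQ expression; the country codes, patterns, and record fields are illustrative assumptions covering only US and Canadian formats.

```python
import re

# Hypothetical validation patterns: US ZIP / ZIP+4 and Canadian postal codes.
PATTERNS = {
    "US": re.compile(r"^\d{5}(-\d{4})?$"),
    "CA": re.compile(r"^[A-Za-z]\d[A-Za-z] ?\d[A-Za-z]\d$"),
}

def segregate_by_country(records):
    """Split address records into valid/invalid buckets by country rule.

    A record is valid when its country has a known pattern and its zip
    matches; everything else (unknown country, bad zip) goes to invalid.
    """
    valid, invalid = [], []
    for rec in records:
        pattern = PATTERNS.get(rec.get("country"))
        if pattern and pattern.match(rec.get("zip", "")):
            valid.append(rec)
        else:
            invalid.append(rec)
    return valid, invalid
```

In IDQ the same check would typically live in a reusable rule built from a Labeler or Expression transformation over a reference table of formats.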
Environment: Informatica MDM 10.1/10.2, Informatica Data Director 10.1/10.2, Python, Informatica ActiveVOS, Informatica PowerCenter 10.1, JBoss EAP 6.4, RHEL 7, MS SQL Server.
Confidential - Minneapolis, MN
Senior Informatica Developer
- Translated the business processes/SAS code into Informatica mappings for building the data mart.
- Used Informatica PowerCenter to load data from different sources such as flat files, Oracle, and Teradata into the Oracle data warehouse.
- Implemented pushdown optimization, pipeline partitioning, and persistent cache for better performance.
- Applied business rules that identify the relationships among the data using Informatica Data Quality (IDQ 8.6).
- Modified existing Informatica Data Quality (IDQ 8.6) workflows to integrate the business rules that certify the quality of the data.
- Defined measurable metrics and required attributes for the subject area to support a robust and successful deployment of the existing Informatica MDM 9.5 platform.
- Planned Informatica MDM 9.5 requirement analysis sessions with business users.
- Created Informatica MDM 9.5 Hub Console mappings.
- Worked on CDC (Change Data Capture) to implement SCD (Slowly Changing Dimensions).
- Data modeling using Star Schema and Snowflake Schema. Strong in source-to-Confidential data mapping and CDC (Change Data Capture) using Slowly Changing Dimension mappings and incremental loads.
- Hands-on experience creating and converting Oracle scripts (SQL, PL/SQL) to Teradata scripts.
- Configured rules for the PowerCenter operations team: no-file monitoring, process-not-started alerts, reject records, and long-running jobs.
- Used various transformations such as Filter, Router, Expression, Lookup (connected and unconnected), Aggregator, Sequence Generator, Update Strategy, Joiner, Normalizer, Sorter, and Union to develop robust mappings in the Informatica Designer.
- Performed POCs on the latest products and technologies that can be used in the Enterprise Business Intelligence area.
- Extensively worked on creation of NoSQL data models and data loads with bloom filters and TTL columns in column families.
- Used MongoDB, an open-source database that avoids the traditional table-based relational structure in favor of JSON-like documents with dynamic schemas (a format MongoDB calls BSON).
- Worked with the Informatica Data Quality (IDQ) toolkit: analysis, data cleansing, data matching, data conversion, exception handling, and the reporting and monitoring capabilities of IDQ.
- Performed data profiling and analysis using Informatica Data Explorer (IDE) and Informatica Data Quality (IDQ).
- Assisted the QC team in carrying out its QC process of testing the ETL components.
- Created pre-session and post-session shell scripts and email notifications.
- Involved in complete cycle from Extraction, Transformation and Loading of data using Informatica best practices.
- Created mappings using Data Services to load data into SAP HANA.
- Involved in data quality checks by interacting with the business analysts.
- Performed unit testing and tuned the mappings for better performance.
- Maintained documentation of ETL processes to support knowledge transfer to other team members.
- Created various UNIX Shell Scripts for scheduling various data cleansing scripts and automated execution of workflows.
- Responsible for requirement definition and analysis in support of Data Warehousing efforts.
- Extensive experience with relational databases: Oracle 10g/11g, DB2, SQL Server, Teradata, Greenplum, Amazon AWS Redshift.
- Sourced data from RDS and an AWS S3 bucket and populated it into Teradata Confidential.
- Used the Source Analyzer and Warehouse Designer to import the source and Confidential database schemas, and the Mapping Designer to map the sources to the Confidential.
- Developed data mappings between source systems and the Confidential system using the Mapping Designer.
- Developed shared folder architecture with reusable mapplets and transformations.
- Extensively worked with the Debugger for handling the data errors in the Mapping Designer.
- Created events and various tasks in the workflows using the Workflow Manager.
- Responsible for tuning ETL procedures to optimize load and query Performance.
- Set up batches and sessions to schedule the loads at the required frequency using Informatica Workflow Manager and an external scheduler.
- Used the Aggregator transformation to load the summarized data for the Sales and Finance departments.
- Extensive Data modeling experience using Dimensional Data modeling, Star Schema modeling, Snowflake modeling, and FACT and Dimensions tables.
- Tested the mapplets and mappings per quality and analysis standards before moving them to the production environment.
- Took part in Informatica administration; migrated development mappings as well as hot fixes to the production environment.
- Involved in writing shell scripts for file transfers and file renaming, and several other database scripts to be executed from UNIX.
- Created web service jobs by configuring WSDL in the Designer and used the Informatica Web Services Hub to start the Informatica tasks.
- Troubleshot issues in TEST and PROD; performed impact analysis and fixed the issues.
- Worked closely with business analysts and gathered functional requirements. Designed technical design documents for the ETL process.
- Developed unit test cases and unit test plans to verify the data loading process, and used UNIX scripts for automating processes.
- Involved as part of production support.
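The change-data-capture idea used throughout these projects can be sketched as a key-based comparison. This is an illustration of the concept, not PowerCenter's actual CDC mechanism; the key and field names are assumed for the example.

```python
def detect_changes(source_rows, target_rows, key="id"):
    """Classify source rows against the current target by primary key.

    Rows whose key is absent from the target are inserts; rows whose key
    exists but whose contents differ are updates. Unchanged rows are
    ignored (they need no DML downstream).
    """
    target_by_key = {r[key]: r for r in target_rows}
    inserts, updates = [], []
    for row in source_rows:
        existing = target_by_key.get(row[key])
        if existing is None:
            inserts.append(row)
        elif existing != row:
            updates.append(row)
    return inserts, updates
```

In a mapping, the same split is what a Lookup on the target plus an Update Strategy (DD_INSERT / DD_UPDATE) produces.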
ENVIRONMENT: Informatica PowerCenter 9.1, Informatica MDM 9.5, Informatica IDQ 8.6, Power Exchange, Teradata, Data Quality, Oracle 11g, MS Access, Oracle WebLogic 10.3.2, UNIX Shell Scripts, Windows NT/2000/XP, SQL Server 2008, SSIS, OBIEE, QlikView, Linux, SQL Assistant, Netezza, DB2.
CareFirst BCBS -Maryland
Sr Informatica Developer
CareFirst is a not-for-profit, non-stock parent company of CareFirst of Maryland Inc and Group Hospitalization and Medical Services Inc (these do business as CareFirst BlueCross BlueShield).
The project takes into account the following processes:
Reinstatement for voided members: if a member pays the premium within the cutoff time, the voided member is reinstated and a report is generated.
Reinstatement for termed members: if a member pays the premium within the cutoff time, the termed member is reinstated and a report is generated.
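The reinstatement rules above reduce to a cutoff check on the premium payment. The sketch below is a hypothetical illustration; the status values and field names are assumptions, not CareFirst's actual system.

```python
from datetime import datetime

def reinstatement_status(member_status, payment_time, cutoff_time):
    """Return the member's status after applying the reinstatement rule.

    A VOIDED or TERMED member who paid the premium on or before the
    cutoff is reinstated; anyone else keeps their current status.
    """
    if (member_status in ("VOIDED", "TERMED")
            and payment_time is not None
            and payment_time <= cutoff_time):
        return "REINSTATED"
    return member_status
```

The downstream report described above would then simply list members whose status flipped to REINSTATED in the run.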
- Involved in analyzing the scope of the application, defining relationships within and between groups of data, star schema, etc.
- Analysis of the star schema in dimensional modeling and identifying suitable dimensions and facts for the schema.
- Involved in the design and development of the data mart and populating the data from different data sources using Informatica.
- Participated in development of reports using Informatica.
- Parsed the high-level design spec into simple ETL coding and mapping standards.
- Created and reviewed Informatica mappings to load data using transformations such as Source Qualifier, Sorter, Aggregator, Expression, Joiner, Lookup, Stored Procedure (creating PL/SQL procedures and functions), Filter, Sequence, Router, Union, and Update Strategy, adhering to the Time Warner coding standards.
- Enabled Agile Business Intelligence (BI) with data virtualization.
- Extensively used Oracle, Netezza, flat file, XML file, and DB2 data as source and Confidential.
- Wrote multiple programs in Python to monitor virtual machine usage data using VMWare API calls.
- Wrote stored procedures in PL/SQL for dropping and recreating indexes.
- Created and executed the test cases for Informatica mappings and UNIX scripts.
- Designed, developed, tested, and maintained Tableau functional reports based on user requirements.
- Regularly interacted with Business Intelligence leadership on project work status, priority setting, and resource allocations.
- Worked with the Security, SiteMinder, Web Hosting, and WebLogic teams to implement SSO.
- Worked with JMS integration with Informatica.
- Worked extensively with advanced analytics such as reference lines and bands and trend lines.
- Expertise in working with data: building groups, hierarchies, and sets.
- Mastered different formatting techniques using annotations and mark labels.
- Developed effective and interactive Dashboards using Parameters and Actions
Environment: Oracle, Informatica PowerCenter, Power Exchange, Business Intelligence Development Studio, Netezza, UNIX Shell Script, Tableau 9.5, PuTTY, DB2, Mainframe COBOL, SQL*Plus, SQL*Loader.
Dunkin Brands Inc., MA
Sr. Informatica Developer
Dunkin' Donuts is the largest coffee and baked goods restaurant chain in the world, with loyal customers in 31 countries, and Baskin-Robbins is one of the largest ice cream specialty store chains. Data is extracted from various sources, cleansed, and stored in the data warehouse using an ETL tool, and reporting on the data warehouse is done using a BI tool.
- GSS (Guest satisfaction Survey)
- ROR (Restaurant Operations)
- FPS (Franchisee Profitability System)
GSS (Guest Satisfaction Survey): The scope of this project is to analyze customer satisfaction with respect to product quality, cleanliness, guest service, etc., and also to analyze product sales and the time windows in which sales are high. Survey data that a person submits online about a Dunkin' store is loaded into the data warehouse, which is useful in analyzing the store's overall performance from a customer's perspective.
ROR (Restaurant Operations): The scope of this project is to load the audit information from Dunkin' stores into the data warehouse. It helps in analyzing a store's maintenance, standards, hospitality, etc. All the data is hierarchical and is loaded into the enterprise data warehouse per the business requirements, which helps in analyzing store performance from a business perspective.
FPS (Franchisee Profitability System): The scope of this project is to analyze the financial aspects of a store with regard to profit and loss, which determines the revenue generated by a store. Point of sale, gross margin, cost of sales, etc. are calculated on a daily basis for every store, giving a clear understanding of the store's performance.
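The daily profitability measures named above (gross margin, cost of sales) follow the standard accounting definitions; the sketch below assumes those definitions and is not the client's actual calculation.

```python
def daily_store_metrics(sales, cost_of_sales):
    """Compute per-store daily profitability from sales and cost of sales.

    Gross profit is sales minus cost of sales; gross margin is gross
    profit as a fraction of sales (0.0 when there were no sales).
    """
    gross_profit = sales - cost_of_sales
    gross_margin = gross_profit / sales if sales else 0.0
    return {"gross_profit": gross_profit, "gross_margin": gross_margin}
```

In the warehouse these would be computed per store per day in an Aggregator or Expression transformation before loading the fact table.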
- Involved in gathering, analyzing, and documenting business requirements, functional requirements, and data specifications from users, and transformed them into technical specifications.
- Extracted data from various sources like flat files, XML files, and Oracle, and loaded it into the enterprise data warehouse.
- Worked on Informatica 9.1 client tools like Source Analyzer, Warehouse Designer, Mapping Designer, Workflow Manager and Workflow Monitor.
- Based on the requirements, used various transformations such as Source Qualifier, Expression, Filter, Router, Update Strategy, Sorter, Lookup, Aggregator, and Joiner in the mappings.
- Created complex mappings using the Mapping Designer and the respective workflows and worklets using the Workflow Manager.
- Troubleshot the mappings using the Debugger and improved data loading efficiency using SQL overrides and Lookup SQL overrides.
- Used version control in order to track changes.
- Developed SCD I and SCD type II mappings.
- Implemented incremental loads, Change Data capture and Incremental Aggregation
- Used the bulk load utility to load bulk data to the database.
- Identified performance bottlenecks and Involved in performance tuning of sources, targets, mappings, transformations and sessions to optimize session performance.
- Created UNIX Shell scripts and called as pre session and post session commands.
- Used SQL tools such as TOAD to run SQL queries and validate the data in the warehouse.
- Responsible for moving the mappings and sessions from the development repository to the production repository, and provided 24/7 production support.
- Developed Unit test plans for every mapping developed and executed teh test plans.
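The incremental loads mentioned above can be sketched as a watermark-based extract: only rows stamped after the last successful load are pulled. This example uses an in-memory SQLite table as a stand-in for the actual Oracle source; the table and column names are assumptions.

```python
import sqlite3

def extract_incremental(conn, last_watermark):
    """Fetch only the rows changed since the last successful load."""
    cur = conn.execute(
        "SELECT id, amount, updated_at FROM sales WHERE updated_at > ? "
        "ORDER BY updated_at",
        (last_watermark,))
    return cur.fetchall()

# Stand-in source table with three rows, two of them past the watermark.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (id INTEGER, amount REAL, updated_at TEXT)")
conn.executemany("INSERT INTO sales VALUES (?, ?, ?)", [
    (1, 10.0, "2010-01-01"),
    (2, 20.0, "2010-01-05"),
    (3, 30.0, "2010-01-09"),
])
delta = extract_incremental(conn, "2010-01-03")
```

After a successful run, the job would persist the max `updated_at` from the delta as the next watermark, typically via a mapping variable in PowerCenter.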
Environment: Informatica PowerCenter 8.6.1, Oracle 10g, Toad for Oracle, Flat Files, XML Files, Erwin 7.3, MS Visio, Windows 2000, UNIX AIX, Shell Scripting, IDQ, PL/SQL, SQL, OBIEE, Appworx Scheduling Tool.
iGATE Global Solutions - IN
IGATE Consulting's key areas of operations are in consulting services, business strategy and solution implementation targeted at communication companies and their suppliers, as well as regulatory authorities. IGATE Telecoms Consulting is also qualified to address multiple business area challenges such as application audit and strategy development, customer care and CRM, order management, service provisioning and fulfillment, IT and network architecture definition, network management and mediation, billing, interconnect, and project and program management.
- Extensively involved in gathering requirements by holding meetings with users.
- Constructed context diagrams and data-flow diagrams based on a description of a business process. Analyzed the data model and identified heterogeneous data sources.
- Constructed an extended entity relationship diagram based on a narrative description of a business scenario.
- Created the source and Confidential definitions using Informatica PowerCenter Designer.
- Used Informatica PowerCenter to load the data into the data warehouse.
- Developed Informatica mappings and mapplets and tuned them for optimum performance and dependencies.
- Created reusable transformations for modifying data before loading into Confidential tables.
- Created mapplets in the Informatica Designer that are generalized and reusable across any number of mappings for ETL jobs.
- Created transformations using SQL scripts to modify the data before loading into tables.
- Created and used mapping parameters and mapping variables in the Informatica Mapping Designer to simplify the mappings.
- Used the Business Objects features Slice and Dice and Drill Down for multi-dimensional analysis.
- Scheduled various daily and monthly ETL loads using Control-M.
- Inserted Objects, Conditions, classes, subclasses and user objects according to client's requirement.
- Prepared SQL queries to validate the data in both source and Confidential databases.
- Managed the database objects: indexes, triggers, procedures, functions, packages, and cursors.
- Worked on TOAD and Oracle SQL Developer to develop queries and create procedures and packages in Oracle.
- Worked along with the UNIX team to write UNIX shell scripts to customize server scheduling jobs.
Environment: Informatica Power Center 8.1/8.6, Oracle 9i, SQL Server, TOAD, Control M, Windows 2003 and UNIX.
Dr Reddy's Laboratories - Hyderabad, ANDHRA PRADESH, IN
Dr Reddy's Laboratories discovers, develops, manufactures, and markets leading prescription medicines for humans and animals and many of the world's best-known consumer brands. The warehouse is loaded with the wholesalers' inventory activity and sales activity data, and the different sales force data is loaded into dimensions. The data warehouse design is based on a Star Schema. The source systems are IBM Mainframe and Oracle, and the data warehouse is in Oracle 8i on UNIX. Informatica PowerCenter 6.1 is used to extract, transform, and load the data into the warehouse.
- Created the mapping documents based on the data model and the client requirements.
- Developed Informatica mappings per the documents.
- Responsible for testing and migration of ETL maps from the development to the production environment.
- Worked on performance tuning.
- Prepared test cases and test plans.
- Loaded the data received from the wholesalers and the distributors, residing on the UNIX box, into the data warehouse on a daily and weekly basis.
- Involved in analysis of data in various modules of the project.
- Worked with the data-modeling tool Erwin.
- Loaded the dimension tables with the various sales force data at various levels.
- Worked with MS Excel spreadsheets and MS Word for documentation.
Environment: Informatica Powercenter 6.2, Cognos 5.x/6.1, DB2, Oracle 8i, UNIX (Solaris), Windows NT.
Bachelor of Computer Science Engineering JNTU
References will be available upon request