
Sr. MDM Developer / WhereScape RED Developer / Profisee Developer Resume


Nashville, TN

SUMMARY:

  • Over eight years of progressive, hands-on experience in analysis, ETL processes, and the design and development of enterprise-level data warehouse architectures and large databases, including designing, coding, testing, and integrating ETL.
  • Proficient in understanding and detailing business requirements, with experience interacting with business users to clarify requirements and translate them into technical and user stories.
  • Experience in dimensional data modeling techniques, Slowly Changing Dimensions (SCD), the Software Development Life Cycle (SDLC) (requirement analysis, design, development, and testing), and data warehouse concepts: Star Schema/Snowflake modeling, fact and dimension tables, and physical and logical data modeling.
  • Experience integrating various data sources such as Oracle 12c/11g/10g, MS SQL Server, XML files, Teradata, Netezza, DB2, flat files, Salesforce, and mainframe sources into a staging area through the ETL process, and loading data into different enterprise-level target databases.
  • Expertise in Master Data Management concepts and methodologies, and the ability to apply this knowledge in building MDM solutions.
  • Experience implementing slowly changing dimensions (SCD Type 1 and Type 2) using mappings.
  • Experience in the installation and configuration of core Informatica MDM Hub components such as Hub Console, Hub Store, Hub Server, Cleanse Match Server, and Cleanse Adapter on Windows.
  • Expertise in developing large, integrated, complex mappings with standard and complex transformations such as the SQL transformation, the Java transformation for calling APIs, stored procedures, and the Flexible Target Key transformation (AI and BI modes); also developed complex reusable mapping fragments (Mapplets) and Worklets, as well as reusable and non-reusable transformations.
  • Extensively worked on ETL automation testing using Ruby; performed unit and acceptance testing through automation, captured major and minor bugs during code development, and delivered highly efficient business deliverables.
  • Experience in real-time data processing using continuous flows, publishing data to Ab Initio and JMS queues, where Ab Initio batch jobs interact with middleware MDM and ESB real-time servers.
  • Hands-on experience with SQL, DB2 SQL, T-SQL, and PL/SQL code; wrote Perl scripts to validate record counts in the XML files, fetch load event IDs to process, update the status of events for a load event ID, build Informatica parameter files, and execute Informatica workflows.
  • Developed UNIX shell scripts to run batch jobs and Informatica workflows from the UNIX server, and automated the entire suite of ETL Ruby test specs through Jenkins integrated with Docker, so unit and acceptance tests run on a daily basis.
  • Experience writing stored procedures and PL/SQL queries to extract and update records in Oracle DB using Ab Initio components.
  • Good experience in Informatica 10.x development.
  • Experience in L2/L3 support for enhancements, troubleshooting, and production issue analysis within time-bound SLAs using Ab Initio, Oracle, Teradata, PL/SQL, and UNIX.
  • Extensively worked on Informatica Data Quality (IDQ) projects, using IDQ transformations such as Address Validator, Parser, Labeler, Match, Exception, Association, Standardizer, and other significant transformations. As a data quality developer, initiated the data profiling process by profiling different formats of data from different sources.
  • Expertise in maintaining different versions of code in GitHub, including project and feature branches, and hands-on experience deploying code from the development server to the test region or production server using the Urban Code Deploy tool (UCD).
  • Strong skills in SQL and PL/SQL packages, functions, stored procedures, triggers, and materialized views to implement business logic in the Oracle database.
  • Designed complex mappings and enhanced performance by identifying bottlenecks at the target, source, mapping, and session levels, and worked with DBAs to identify bottlenecks during transaction commits and log writes.
  • Followed the Test-Driven Development (TDD) practice of writing tests first and developing code next: write a good failing test, develop the code, make the failing test pass, and finally refactor the code.
  • Hands-on experience working in UNIX, Linux, and Windows environments.
  • With both on-site and off-shore experience, developed skills in system analysis, troubleshooting, debugging, deployment, team management, task prioritization, and customer handling.
  • Domain knowledge and experience in the Health, Finance, Insurance, and Telecom domains.

TECHNICAL SKILLS:

Operating System: UNIX, Linux, Windows

Programming and Scripting: C, C++, Java, .Net, Perl Scripting, Shell Scripting, XSLT, PL/SQL, T-SQL.

Specialist Applications & Software: Informatica Power Center 10.1/10/9.6.1/9.5/9.1/8.6/8.1/7.1, Informatica Power Exchange, Metadata Manager, Informatica Data Quality (IDQ), Informatica Data Explorer (IDE), MDM, SSIS, Salesforce, DataStage, WhereScape RED 8.4.2.0, Profisee 8.0.2, etc.

Data Modeling (working knowledge): Relational Modeling, Dimensional Modeling (Star Schema, Snowflake, FACT, Dimensions), Physical and Logical Data Modeling, and ER Diagrams.

Databases tools: SQL Server Management Studio (2008), Oracle SQL Developer (3.0), Toad 11.6 (Oracle), Teradata, AQT v9 (Advanced query tool) (Oracle/Netezza), DB Artisan 9.0 (Sybase), SQL Browser (Oracle Sybase), Visio, ERWIN

Scheduling tools: Informatica Scheduler, CA Scheduler (Autosys), ESP, Maestro, Control-M.

Conversion/Transformation tools: Informatica Data Transformation, Oxygen XML Editor (ver.9, ver.10)

Software Development Methodology: Agile, Waterfall.

PROFESSIONAL EXPERIENCE

Confidential

Sr. MDM Developer/WhereScape Red Developer/Profisee Developer

RESPONSIBILITIES:

  • WhereScape RED development and deployment activity: developed using the right, middle, and left panes; created the landing and p-landing tables and developed the stage to p-stage loads.
  • Used the Scheduler to schedule jobs to run on a daily, weekly, and monthly basis.
  • Scheduled the micro-batch and ran the loads every hour to pick up the updated data.
  • Used the Profisee tool for MDM development, matching and merging records to produce the golden version of the data.
  • Involved in all phases of SDLC from requirement gathering, design, development, testing, Production, and support for production environment.
  • Defined, configured, and optimized various MDM processes including landing, staging, base objects, foreign-key relationships, lookups, query groups, queries/custom queries, cleanse functions, batch groups and packages using Informatica MDM Hub console.
  • Involved in extracting the data from the Flat Files and Relational databases into staging area.
  • Created Stored Procedures for data transformation purpose.
  • Involved in the Dimensional Data Modeling and populating the business rules using mappings into the Repository for Data management
  • Created profile and scorecards to review data quality.
  • Actively involved in Data validations and unit testing to make sure data is clean and standardized before loading in to MDM landing tables.
  • Involved in doing Unit Testing, Integration Testing, and Data Validation.
  • Extensively involved in migrating Informatica mappings from Dev to SIT and from UAT to the Production environment.
  • Developed UNIX scripts to SFTP, archive, cleanse, and process many flat files.
  • Created and ran pre-existing and debug sessions in the Debugger to monitor and test sessions prior to their normal runs in the Workflow Manager.
  • Created mapping parameters and variables and wrote parameter files.
  • Used SQL queries and database programming using PL/SQL (writing Packages, Stored Procedures/Functions, and Database Triggers).
  • Used SQL tools like TOAD to run SQL queries and validate the data in warehouse.
  • Worked with the SCM code management tool to move the code to Production
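The flat-file SFTP/archive/cleanse step mentioned above can be sketched as a small shell script. The directory paths, file pattern, and demo file below are illustrative placeholders, not the actual production values:

```shell
#!/bin/sh
# Minimal sketch of a landing-area cleanse-and-archive step.
# LANDING/ARCHIVE paths and the demo file are illustrative only.
LANDING=/tmp/etl_demo/landing
ARCHIVE=/tmp/etl_demo/archive
mkdir -p "$LANDING" "$ARCHIVE"

# Seed one demo file with Windows line endings for illustration.
printf 'cust_id,name\r\n100,Acme\r\n' > "$LANDING/demo.dat"

for f in "$LANDING"/*.dat; do
    [ -e "$f" ] || continue                               # nothing to process
    tr -d '\r' < "$f" > "$f.clean" && mv "$f.clean" "$f"  # strip CRLF endings
    # Archive with a timestamp so a rerun never overwrites a prior file.
    mv "$f" "$ARCHIVE/$(basename "$f").$(date +%Y%m%d%H%M%S)"
done
```

In the real job the loop would be preceded by an sftp/scp pull from the source server; the cleanse and timestamped-archive pattern is the part sketched here.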

ENVIRONMENT: WhereScape RED 8.4.2, Profisee 8.0.2, UNIX, SQL, IDE, Ab Initio, CDC, MDM, Linux, Perl, Shell, PL/SQL, Netezza, Teradata, Microsoft SQL Server 15.0, and Microsoft Visual Studio

Confidential, Nashville, TN

Sr. Informatica Developer/Oracle/Application Developer

RESPONSIBILITIES:

  • Used Informatica Power Center for (ETL) extraction, transformation and loading data from heterogeneous source systems into target database.
  • Involved in extracting the data from the Flat Files and Relational databases into staging area.
  • Created Stored Procedures for data transformation purpose.
  • Involved in the Dimensional Data Modeling and populating the business rules using mappings into the Repository for Data management
  • Worked with Informatica Power Center 9.x tools: Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer, and Transformations.
  • Extensively used ETL to load data using Power Center/Power Exchange from source systems such as flat files and Excel files into staging tables, and loaded the data into the target Oracle database. Analyzed the existing systems and made a feasibility study.
  • Used Informatica Power Center and Data Quality transformations (Source Qualifier, Expression, Joiner, Filter, Router, Update Strategy, Union, Sorter, Aggregator, Normalizer, Standardizer, Labeler, Parser, Address Validator (Address Doctor), Match, Merge, and Consolidation) to extract, transform, cleanse, and load data from different sources into DB2, Oracle, Teradata, Netezza, and SQL Server targets.
  • Created and configured workflows, Worklets, and Sessions to transport data to target warehouse Oracle tables using Informatica Workflow Manager.
  • Worked on Database migration from Teradata legacy system to Netezza and Hadoop.
  • Worked in building Data Integration and Workflow Solutions and Extract, Transform, and Load (ETL) solutions for data warehousing using SQL Server Integration Service (SSIS).
  • Extracted data from oracle database and spreadsheets, CSV files and staged into a single place and applied business logic to load them in the central oracle database.
  • Developed Oracle PL/SQL Packages, Procedures, Functions and Database Triggers.
  • Used Teradata Utilities (BTEQ, Multi-Load, and Fast-Load) to maintain the database.
  • Built a re-usable staging area in Teradata for loading data from multiple source systems using template tables for profiling and cleansing in IDQ
  • Created profile and scorecards to review data quality.
  • Actively involved in Data validations and unit testing to make sure data is clean and standardized before loading in to MDM landing tables.
  • Involved in doing Unit Testing, Integration Testing, and Data Validation.
  • Extensively involved in migrating Informatica mappings from Dev to SIT and from UAT to the Production environment.
  • Developed UNIX scripts to SFTP, archive, cleanse, and process many flat files.
  • Created and ran pre-existing and debug sessions in the Debugger to monitor and test sessions prior to their normal runs in the Workflow Manager.
  • Created mapping parameters and variables and wrote parameter files.
  • Used SQL queries and database programming using PL/SQL (writing Packages, Stored Procedures/Functions, and Database Triggers).
  • Worked on Scheduling Jobs and monitoring them through Control M and CA scheduler tool (Autosys).
  • Used SQL tools like TOAD to run SQL queries and validate the data in warehouse.
  • Worked with the SCM code management tool to move the code to Production
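The Teradata utility work above (BTEQ, MultiLoad, FastLoad) was typically driven from shell wrappers. A sketch of generating a BTEQ check script via a here-document follows; the logon target, database, and table names are hypothetical, not the actual project values:

```shell
#!/bin/sh
# Sketch: build a BTEQ script from a shell wrapper (all names are placeholders).
WORKDIR=/tmp/etl_demo
mkdir -p "$WORKDIR"
TARGET_DB=DEV_STG          # hypothetical staging database
TARGET_TBL=CUSTOMER_STG    # hypothetical staging table

cat > "$WORKDIR/load_check.bteq" <<EOF
.LOGON tdprod/etl_user,***;
-- Row-count check after the MultiLoad/FastLoad step
SELECT COUNT(*) FROM ${TARGET_DB}.${TARGET_TBL};
.IF ERRORCODE <> 0 THEN .QUIT 8;
.LOGOFF;
.QUIT 0;
EOF

# In the real environment the wrapper would then run: bteq < "$WORKDIR/load_check.bteq"
```

The `.IF ERRORCODE` / `.QUIT` pattern lets the calling script branch on the BTEQ exit status, which is how batch schedulers detect a failed load step.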

ENVIRONMENT: Informatica Power Center 10.1, UNIX, SQL, IDQ, IDE, Ab Initio, CDC, MDM, Linux, Perl, WinSCP, Shell, PL/SQL, Netezza, Teradata, Microsoft SQL Server 2008, and Microsoft Visual Studio

Confidential, Boston, MA

Sr. Informatica Developer/Application Developer

RESPONSIBILITIES:

  • Involved in gathering and analyzing business requirements, writing requirement specification documents, and identifying data sources and targets.
  • Communicated with business customers to discuss the issues and requirements.
  • Designed, documented and configured the Informatica MDM Hub to support loading, cleansing of data.
  • Involved in implementing the Land Process of loading the customer/product Data Set into Informatica MDM from various source systems.
  • Worked on data cleansing and standardization using the cleanse functions in Informatica MDM.
  • Developed several complex IDQ mappings using a variety of Power Center transformations, mapping parameters, mapping variables, Mapplets, and parameter files in the Mapping Designer using Informatica Power Center.
  • Profiled the data using Informatica Data Explorer (IDE) and performed a proof of concept for Informatica Data Quality (IDQ).
  • Developed data processing applications (graphs, plans, xfrs, etc.) using various Ab Initio components.
  • Used Informatica Power Center to load data from different data sources such as XML, flat files, Oracle, Teradata, and Salesforce.
  • Imported the IDQ address-standardization mappings into Informatica Designer as Mapplets.
  • Utilized Informatica IDQ to complete initial data profiling and to match and remove duplicate data.
  • Used relational SQL wherever possible to minimize the data transfer over the network.
  • Identified and validated the Critical Data Elements in IDQ.
  • Built several reusable components on IDQ using Standardizers and Reference tables which can be applied directly to standardize and enrich Address information.
  • Involved in writing UNIX shell scripts for the Informatica ETL tool to run the sessions.
  • Effectively worked in Informatica version-based environment and used deployment groups to migrate the objects.
  • Used debugger in identifying bugs in existing mappings by analyzing data flow, evaluating transformations.
  • Extensively worked on Labeler, Parser, Key Generator, Match, Merge, and Consolidation transformations to identify the duplicate records.
  • Worked on CDC (Change Data Capture) to implement SCD (Slowly Changing Dimensions) Type 1 and Type 2
  • Extensively used various active and passive transformations like Filter Transformation, Router Transformation, Expression Transformation, Source Qualifier Transformation, Joiner Transformation, and Look up Transformation, Update Strategy Transformation, Sequence Generator Transformation, and Aggregator Transformation.
  • Used the Teradata FastLoad utility to load data into tables.
  • Used SQL tools like TOAD to run SQL queries and validate the data.
  • Converted all the jobs scheduled in Maestro to the Autosys scheduler as per the requirements.
  • Wrote UNIX shell scripts for Informatica pre-session and post-session tasks and Autosys scripts for scheduling the jobs (workflows).
  • Used Linux scripts and necessary Test Plans to ensure the successful execution of the data loading process
  • Provided Knowledge Transfer to the end users and created extensive documentation on the design, development, implementation, daily loads, and process flow of the mappings
  • Provided 24x7 production support for business users and documented problems and solutions for running the workflows.
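The SCD Type 2 / CDC pattern referenced above follows a standard expire-then-insert shape. As a sketch, the SQL a post-session script might submit can be written out from a shell wrapper; the table and column names (`dim_customer`, `stg_customer`, `address`, flag/date columns) are hypothetical, and the insert is simplified:

```shell
#!/bin/sh
# Sketch: emit the SCD Type 2 close-out/insert SQL from a shell wrapper.
# dim_customer/stg_customer and their columns are illustrative names only.
mkdir -p /tmp/etl_demo
cat > /tmp/etl_demo/scd2.sql <<'EOF'
-- Step 1: expire the current row when a tracked attribute changed
UPDATE dim_customer d
   SET d.eff_end_dt   = CURRENT_DATE - 1,
       d.current_flag = 'N'
 WHERE d.current_flag = 'Y'
   AND EXISTS (SELECT 1 FROM stg_customer s
                WHERE s.cust_id = d.cust_id
                  AND s.address <> d.address);

-- Step 2: insert the new version as the current row
-- (simplified: a real load restricts this to new or changed keys)
INSERT INTO dim_customer
        (cust_id, address, eff_start_dt, eff_end_dt, current_flag)
SELECT s.cust_id, s.address, CURRENT_DATE, DATE '9999-12-31', 'Y'
  FROM stg_customer s;
EOF
```

The high effective-end date (`9999-12-31`) plus a `current_flag` is one common convention for marking the open row; Informatica mappings implement the same logic with Lookup and Update Strategy transformations.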

ENVIRONMENT: Informatica Power Center 10.1, UNIX, SQL, IDQ, IDE, Ab Initio, CDC, MDM, Linux, Perl, WinSCP, Shell, PL/SQL, Netezza, Teradata, Microsoft SQL Server 2008, and Microsoft Visual Studio

Confidential, OR

Sr. ETLInformatica Developer

RESPONSIBILITIES:

  • Involved in all phases of SDLC from requirement gathering, design, development, testing, Production, and support for production environment.
  • Defined, configured, and optimized various MDM processes including landing, staging, base objects, foreign-key relationships, lookups, query groups, queries/custom queries, cleanse functions, batch groups and packages using the Informatica MDM Hub console.
  • Used Informatica Power Center for (ETL) extraction, transformation and loading data from heterogeneous source systems into target database.
  • Worked on stored procedures and PL/SQL queries to extract and update records in Oracle DB using Ab Initio components.
  • Created Stored Procedures for data transformation purpose.
  • Involved in the Dimensional Data Modeling and populating the business rules using mappings into the Repository for Data management
  • Worked with Informatica Power Center 9.x tools: Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer, and Transformations.
  • Involved in performance tuning of Ab Initio graphs using ICFFs, multifiles, compressed flows, and partitioning on keys.
  • Used Informatica Power Center and Data Quality transformations (Source Qualifier, Expression, Joiner, Filter, Router, Update Strategy, Union, Sorter, Aggregator, Normalizer, Standardizer, Labeler, Parser, Address Validator (Address Doctor), Match, Merge, and Consolidation) to extract, transform, cleanse, and load data from different sources into DB2, Oracle, Teradata, Netezza, and SQL Server targets.
  • Worked on Database migration from Teradata legacy system to Netezza and Hadoop.
  • Worked in building Data Integration and Workflow Solutions and Extract, Transform, and Load (ETL) solutions for data warehousing using SQL Server Integration Service (SSIS).
  • Used Teradata Utilities (BTEQ, Multi-Load, and Fast-Load) to maintain the database.
  • Built a re-usable staging area in Teradata for loading data from multiple source systems using template tables for profiling and cleansing in IDQ
  • Created profile and scorecards to review data quality.
  • Actively involved in Data validations and unit testing to make sure data is clean and standardized before loading in to MDM landing tables.
  • Generated PL/SQL and Shell scripts for scheduling periodic load processes.
  • Designed and developed UNIX scripts for creating, dropping tables which are used for scheduling the jobs.
  • Invoked Informatica using the "pmcmd" utility from UNIX scripts.
  • Wrote pre-session shell scripts to check the session mode (enabled/disabled) before running or scheduling batches.
  • Involved in supporting a 24x7 on-call rotation, with a strong grip on the Tivoli scheduling tool.
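A typical `pmcmd` wrapper of the kind described above can be sketched as follows; the integration service, domain, folder, and workflow names are placeholders, not the actual project values:

```shell
#!/bin/sh
# Sketch of driving an Informatica workflow from a UNIX wrapper via pmcmd.
# Service, domain, folder, and workflow names below are illustrative only.
INFA_USER=etl_user
INT_SERVICE=IS_DEV
DOMAIN=Domain_Dev
FOLDER=DW_LOADS
WORKFLOW=wf_daily_load

# Build the command first so it can be logged before execution.
PMCMD_ARGS="startworkflow -sv $INT_SERVICE -d $DOMAIN -u $INFA_USER -uv INFA_PASSWD -f $FOLDER -wait $WORKFLOW"
echo "pmcmd $PMCMD_ARGS"

# In the real environment the wrapper would run the workflow and branch
# on pmcmd's exit status:
# pmcmd $PMCMD_ARGS || { echo "workflow $WORKFLOW failed" >&2; exit 1; }
```

Passing `-wait` makes `pmcmd` block until the workflow finishes, so the wrapper's exit code reflects workflow success or failure for the scheduler.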

ENVIRONMENT: Informatica Power Center 9.6.1, UNIX, Oracle, Ab Initio, Linux, Perl, Shell, MDM, IDQ, PL/SQL, Tivoli, Oracle 11g/10g, Teradata 14.0.

Confidential, MN

Sr. ETL Informatica Developer

RESPONSIBILITIES:

  • Worked with Business Analysts (BAs) to analyze data quality issues and find the root cause of each problem, along with the proper solution to fix the issue.
  • Documented the process that resolves each issue, covering analysis, design, construction, and testing for data quality issues.
  • Involved in making data model changes and other changes to the transformation logic in the existing mappings according to the business requirements for the incremental fixes.
  • Worked extensively with Informatica tools like Source Analyzer, Warehouse Designer, Transformation Developer, and Mapping Designer.
  • Extensively used Informatica Data Explorer (IDE)&Informatica Data Quality (IDQ) profiling capabilities to profile various sources, generate score cards, create and validate rules and provided data for business analysts for creating the rules.
  • Created Informatica workflows and IDQ mappings for - Batch and Real Time.
  • Extracted data from various heterogeneous sources such as DB2, Salesforce, mainframes, Teradata, and flat files using Informatica Power Center and loaded the data into the target database.
  • Created and Configured Landing Tables, Staging Tables, Base Objects, Foreign key relationships, Queries, Query Groups etc. in MDM.
  • Defined the Match and Merge Rules in MDM Hub by creating Match Strategy, Match columns and rules for maintaining Data Quality
  • Used connected and unconnected Lookup transformations and lookup caches to look up data from relational tables and flat files. Used the Update Strategy transformation extensively with DD_INSERT, DD_UPDATE, DD_REJECT, and DD_DELETE.
  • Extensively implemented SCD Type 2 mappings for CDC (Change Data Capture) in the EDW.
  • Involved in doing Unit Testing, Integration Testing, and Data Validation.
  • Extensively involved in migrating Informatica mappings from Dev to SIT and from UAT to the Production environment.
  • Developed UNIX scripts to SFTP, archive, cleanse, and process many flat files.
  • Created and ran pre-existing and debug sessions in the Debugger to monitor and test sessions prior to their normal runs in the Workflow Manager.
  • Extensively worked on migrating mappings, worklets, and workflows within the repository from one folder to another, as well as among different repositories.
  • Created mapping parameters and variables and wrote parameter files.
  • Used SQL queries and database programming using PL/SQL (writing Packages, Stored Procedures/Functions, and Database Triggers).
  • Worked on scheduling jobs and monitoring them through Control-M and the CA scheduler tool (Autosys).
  • Used SQL tools like TOAD to run SQL queries and validate the data in warehouse.
  • Worked with the SCM code management tool to move the code to Production
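Parameter files like those mentioned above are plain text keyed by folder, workflow, and session. A sketch of one generated from a shell wrapper follows; the folder, workflow, session, and parameter names are hypothetical:

```shell
#!/bin/sh
# Sketch: build an Informatica parameter file from a shell wrapper.
# Folder/workflow/session names and parameters are illustrative only.
mkdir -p /tmp/etl_demo
RUN_DT=$(date +%Y-%m-%d)

cat > /tmp/etl_demo/wf_daily_load.param <<EOF
[DW_LOADS.WF:wf_daily_load.ST:s_m_load_customer]
\$\$RUN_DATE=$RUN_DT
\$\$SRC_FILE=/data/inbound/customer.dat
\$DBConnection_TGT=ORA_DW_TGT
EOF
```

Mapping parameters (`$$...`) and connection variables (`$DBConnection...`) are resolved by the Integration Service at session start, so regenerating this file per run is how a wrapper feeds run-specific values such as the load date into a workflow.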

ENVIRONMENT: Informatica Power Center 9.0.1, Erwin, Teradata, Tidal, SQL Assistant, DB2, XML, Oracle 9i/10g/11g, MQ Series, OBIEE 10.1.3.2, IDQ, MDM, Toad, and UNIX shell scripts.

Confidential, Pittsburgh, PA

Sr. Informatica Developer/IDQ/MDM

RESPONSIBILITIES:

  • Designing the dimensional model and data load process using SCD Type 2 for the quarterly membership reporting purposes.
  • Derived the dimensions and facts for the given data and loaded them on a regular interval as per the business requirement.
  • Generating the data feeds from analytical warehouse using required ETL logic to handle data transformations and business constraints while loading from source to target layout.
  • Worked on Master Data Management (MDM), Hub Development, extract, transform, cleansing, loading the data onto the staging and base object tables
  • Extracted data from multiple sources such as Oracle, XML, and Flat Files and loaded the transformed data into targets in Oracle, Flat Files.
  • Wrote Shell Scripts for Data loading and DDL Scripts.
  • Worked with Informatica Data Quality (IDQ) toolkit, Analysis, data cleansing, data matching, data conversion, exception handling, and reporting and monitoring capabilities of IDQ
  • Designed and coded the automated balancing process for the feeds that go out from the data warehouse.
  • Implemented the automated balancing and control process, enabling audit, balance, and control for the ETL code.
  • Hands-on experience with HIPAA transactions such as 270, 271, 272, 273, 274, 275, 276, 277, 834, 835, and 837.
  • Improving the database access performance by tuning the DB access methods like creating partitions, using SQL hints, and using proper indexes.
  • All the jobs are integrated using complex Mappings including Mapplets and Workflows using Informatica power center designer and workflow manager.
  • Worked on data cleansing and standardization using the cleanse functions in Informatica MDM.
  • Designed and created complex source to target mappings using various transformations inclusive of but not limited to Aggregator, Look Up, Joiner, Source Qualifier, Expression, Sequence Generator, and Router Transformations.
  • Mapped client processes/databases/data sources/reporting software to HPE’s XIX X12 processing systems(BizTalk/Visual Studio/Oracle SQL/MS SQL/C#/.Net/WSDL/SOAP/Rest/API/XML/XSLT).
  • Used the debugger to identify bugs in existing mappings by analyzing data flow and evaluating transformations, and created mapplets that provide reusability in mappings.
  • Analyzing the impact and required changes to incorporate the standards in the existing data warehousing design.
  • Followed the PDLC process to move the code across environments through proper approvals and source control environments.
  • Source control using SCM.

ENVIRONMENT: Informatica Power Center 9.0.1, Erwin 7.2/4.5, Business Objects XI, Unix Shell Scripting, XML, Oracle 11g/10g, DB2 8.0, IDQ, MDM, TOAD, MS Excel, Flat Files, SQL Server 2008/2005, PL/SQL, Windows NT 4.0

Confidential

Informatica Developer

RESPONSIBILITIES:

  • Involved in analysis, design, development, test data preparation, unit and integration testing, Preparation of Test cases and Test Results
  • Coordinating with client, Business and ETL team on development
  • Developed Batch jobs using extraction programs using COBOL, JCL, VSAM, Datasets, FTP to Load Informatica tables
  • Involved in full project life cycle - from analysis to production implementation and support with emphasis on identifying the source and source data validation, developing logic, and transformation as per the requirement and creating mappings and loading the data into BI database.
  • Based on the business requirements created Functional design documents and Technical design specification documents for ETL Process.
  • Designing and developing ETL solutions in Informatica Power Center 8.1
  • Designing ETL process and creation of ETL design and system design documents.
  • Developing code to extract, transform, and load (ETL) data from inbound flat files and various databases into various outbound files using complex business logic.
  • Most of the transformations were used like Source Qualifier, Aggregator, Filter, Expression, and Unconnected and connected Lookups, Update Strategy.
  • Created automated shell scripts to transfer files among servers using FTP, SFTP protocols and download files.
  • Developed Informatica mappings, enabling the ETL process for large volumes of data into target tables.
  • Designed and developed process to handle high volumes of data and high volumes of data loading in each SLA.
  • Created Workflow, Worklets and Tasks to schedule the loads at required frequency using Informatica scheduling tool.
  • Ensured acceptable performance of the data warehouse processes by monitoring, researching and identifying the root causes of bottlenecks.

ENVIRONMENT: Informatica Power Center 8.1, ETL, Business Objects, Oracle 10g/9i/8i, HP-UX, PL/SQL.
