
ETL Developer Resume


Germantown, MD

SUMMARY:

  • Over nine years of extensive experience across the complete software development life cycle (SDLC), with a strong background in ETL, development, and implementation of various data warehouses/data marts.
  • Extensive experience as ETL Developer in technologies and methods using Informatica Power Center 9.6.1, 9.5, 8.x, 7.x.
  • Worked with the Informatica Data Quality (IDQ) 9.6.1/10.1 toolkit for analysis, data cleansing, data matching, and data conversion.
  • Good understanding of Ralph Kimball and Bill Inmon methodologies.
  • Extensively worked on dimensional modeling (Star/Snowflake), data migration, Slowly Changing Dimensions, data cleansing, and data staging of operational sources using ETL processes, and provided data mining features for data warehouses.
  • Experience in integrating various Operational Data Store (ODS) sources with multiple relational databases such as Oracle, SQL Server, and DB2, and in integrating data from fixed-width and delimited flat files.
  • Actively involved in SQL performance tuning, ETL tuning, and error handling.
  • Experience in implementing complex business rules by creating transformations and re-usable transformations (Expression, Aggregator, Filter, Connected and Unconnected Lookups, Router, Rank, Joiner, Update Strategy), and developing complex Mapplets and Mappings.
  • Experience in Data Modeling (Logical and Physical Design for distributed databases, Reverse-engineering and Forward-engineering using Erwin).
  • Involved in the designing and building of Universes using Business Objects.
  • Excellent knowledge of analyzing data dependencies using metadata stored in the Informatica repository and preparing batches of existing sessions to schedule multiple sessions.
  • Proficient in understanding business processes/requirements and translating them into technical requirements.
  • Experience in building Oracle Data Warehouse from different OLTP systems.
  • Strong experience with MS SQL, PL/SQL, T-SQL, Stored Procedures, Packages, and Triggers.
  • Experience with Teradata as the target for data marts; worked with BTEQ, FastLoad, and MultiLoad.
  • Extensive experience with Change Data Capture (CDC).
  • Worked on exception handling mappings for data quality, data profiling, data cleansing, and data validation.
  • Experience in shell scripting to run ETL mappings and file handling.
  • Experience with various databases including Oracle 11g/10g, IBM DB2, and SQL Server 2005/2008.
  • Developed Test Scripts, Test Cases, and SQL QA scripts to perform Unit testing, System Testing and Load testing.
  • Seamlessly migrated the Code from Development to Testing, UAT and Production.
  • Experience with various domain areas such as Health Care, Pharmaceuticals, Finance, Supply Chain and Manufacturing.
  • Expertise in preparing report specifications and database designs to support reporting requirements.
  • Reviewed session log files to trace causes of bottlenecks.
  • Heavily supported Teradata Production environments for different kinds of failures and fixes.
  • Developed mappings in Informatica to load the data from various sources into the Data Warehouse, using different transformations like Source Qualifier, Java, Expression, Lookup, Aggregate, Update Strategy, Transaction control, Normalizer and Joiner.
  • Experience in resolving on-going maintenance issues and bug fixes; monitoring Informatica sessions as well as performance tuning of mappings and sessions.
  • Experience with Partitioning at Informatica level.
  • Experience in Oracle SQL query tuning.
  • Prepared user manuals and presentations and trained users and operators.
  • Strong troubleshooting, problem-solving, analytical, and design skills.
  • Worked with teams of diverse size and composition.
  • Experience in understanding source systems using given ER models.
  • Exposure to dimensional data models such as Star and Snowflake schemas.
  • Extensive experience in writing UNIX shell scripts and automation of the ETL processes using UNIX shell scripting.
  • Knowledge of Informatica IDQ and Informatica MDM.
  • Experience in using Automation Scheduling tools like Maestro, Crontab, Autosys and Control-M.
  • Worked extensively with slowly changing dimensions (SCDs) and Change data capture (CDC).
  • Experience with supporting heavy Data Warehouse Environments.
  • Experience in Creating Design Documents, Run Books and Test Documents.
  • Expertise with release & change Management.
  • Implemented several DWH projects using both Agile and Waterfall methodologies. Experience with Agile tools Rally and Jira for task tracking.
  • Good communication skills with strong ability to interact with end-users, customers, and team members.
  • Well acquainted with performance tuning of Informatica Power Center mappings by identifying bottlenecks.
  • Proficient experience in different Databases like Oracle, SQL Server.
  • Expertise in entire life cycle of DWH with project scope, Analysis, requirements gathering, data modeling, ETL Design, development, Unit / System testing and production support.
  • Strong experience in understanding dimensional modeling - Star and Snowflake schemas, identifying facts and dimensions.
  • Expertise in working with relational databases such as Oracle 12c/11g/10g, Teradata 15/14/12, SQL Server 2000/2005/2008, DB2, MySQL, and Sybase, as well as flat files and XML.
  • Extensive experience in developing Stored Procedures, Functions, Views and Triggers, Complex SQL queries and Oracle PL/SQL.

TECHNICAL SKILLS:

ETL Tools: Informatica Power Center 10.2/9.6.1/9.5.1/8.6.1/8.1.1

Modeling: Dimensional Data Modeling, Star Schema Modeling, Snowflake Schema Modeling, Erwin, Microsoft Visio

RDBMS: Oracle 12c/11g/10g, Teradata 15/14/13/12, DB2, SQL Server 2000/2005/2008, MySQL, Sybase

QA Tools: WinRunner, QuickTest Pro, Test Director, Quality Center

Reporting Tools: Cognos, Business Objects, Dashboard Reporting

Languages: Java, XML, UNIX Shell Scripting, SQL, PL/SQL

Operating Systems: Windows, Unix, Linux

PROFESSIONAL EXPERIENCE:

Confidential, Germantown, MD

ETL Developer

Responsibilities:

  • Worked on a project for External Workers Compliance Reporting for different lines of business.
  • Transformed existing data and derived new data as per the needs of the stakeholders.
  • Participated in team meetings with Business Analysts of other teams to gather requirements and translate them into technical requirements.
  • Worked on ClaimCenter, BillingCenter, and PolicyCenter to extract data from Guidewire systems.
  • Performed Unit Testing, System Testing and Regression testing and validated the data.
  • Involved in testing Pentaho mappings using the test data scenarios and upgraded the QA Test Plans.
  • Extensively involved in fine-tuning the Pentaho code (mappings and sessions), stored procedures, and SQL queries to obtain optimal performance and throughput.
  • Fixed defects and ran daily workflows to monitor reworked changes in all environments.
  • Created build documentation used for migrating code and mappings from one environment to another.
  • Performed Data validations across Sources, Operational Data Store and EDW.
  • Developed internal and external interfaces to send data at regular intervals to data warehouse systems.
  • Performed data migration across different sites on a regular basis.
  • Developed complex SQL queries for interfaces that extract data at regular intervals to meet business requirements, and extensively used Teradata utilities such as MultiLoad, FastLoad, TPT, BTEQ, and FastExport.
  • Performed Unit testing of developed components and promoted the code to testing environment.
  • Involved in the error checking and testing of ETL Procedures using Informatica Session log and workflow logs.
  • Coordinated with testing team for Bug fixes.
  • Prepared the implementation plan and coordinated with Release management team for production deployment.
  • Coordinated with Operations team for backfill requirements.
  • Monitored the scheduled jobs using Informatica Monitor.
  • Involved in performance tuning and optimization of Informatica mappings and sessions using features like partitions and data/index cache to manage very large volume of data.
  • Worked closely with different teams to make sure that the dependency of data is being met and ensured that there are no hurdles.
  • Extensively used Power Center to design multiple mappings with embedded business logic.
  • Created transformations like Expression, Lookup, Joiner, Rank, Update Strategy and Source Qualifier Transformation using the Informatica designer.
  • Created mapplet and used them in different mappings.
  • Extensively involved in performance tuning of the Informatica ETL mappings by using the caches and overriding the SQL queries and also by using Parameter files.
  • Worked on Flat Files and XML, DB2, Oracle as sources.
  • Written PL/SQL Procedures and functions.
  • Involved in change data capture (CDC) ETL process.
  • Implemented Slowly Changing Dimension Type I & Type II for different Dimensions.
  • Analyzed data from various sources and integrated it to fit the WIP project model.
  • Analyzed the Functional Specs provided by the Business Analyst.
  • Developed ETL processes to extract data from source to staging and from staging to ODS
  • Developed ETL processes to load various Dimensions and Facts.
  • Developed change data capture (CDC) processes to identify new and changed loan-related information.
  • Worked on slowly changing dimensions based on the business requirements.
  • Applied the necessary transformation rules.
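The parameter-file overrides mentioned above follow PowerCenter's plain-text format of bracketed [folder.workflow.session] headers and $-prefixed keys. A minimal illustrative fragment (all folder, workflow, connection, and path names are hypothetical):

```
[DWH_Folder.WF:wf_load_claims.ST:s_m_load_claims]
$DBConnection_Source=ORA_SRC_DEV
$DBConnection_Target=ORA_DWH_DEV
$InputFile1=/data/incoming/claims_20240101.dat
$$LOAD_DATE=01/01/2024
```

Pointing each session at a different parameter file per environment lets the same mapping run against Dev, QA, and Production connections without any code change.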

Environment: Informatica Power Center 9.6, Informatica IDQ 9.1, Oracle 11g, SQL Server 2008, IBM DB2, MS Access, Windows XP, Toad, SQL Developer.

Confidential

ETL Developer

Responsibilities:

  • Experience Working with Informatica Data Quality 10.1 transformations.
  • Performed source analysis by profiling with the IDQ Analyst tool and built scorecards to monitor the refined data phases.
  • Worked with Informatica team members on designing, documenting, and configuring the Informatica IDQ suite to support loading, cleansing, matching, merging, and publication of DQM data.
  • Performed match/merge and ran match rules to check the effectiveness of the DQM process on data.
  • Performed row and column profiling by creating business rules on source tables using Informatica Analyst.
  • Experience in development, maintenance, support, and enhancement in Informatica Power Center, Teradata, Informatica Data Quality (IDQ), Informatica Big Data Management (BDM), Oracle, Hadoop Hortonworks, and Informatica Data Validation Option (DVO).
  • Experience in implementing Data Quality rules on Hive Data sources using Informatica Big Data Edition 10.x
  • Experience using Sqoop and Teradata connectors (TDCH) to export data into Hadoop from Teradata.
  • Involved in creating HLD and LLD documents based on business requirements, and got them reviewed by customers along with detailed project timelines.
  • Created complex mappings in Power Center Designer using Aggregate, Expression, Filter, and Sequence Generator, Update Strategy, Union, Lookup, Joiner, XML Source Qualifier and Stored procedure transformations.
  • Worked on Power Center Tools like designer, workflow manager, workflow monitor and repository manager.
  • Developed the mappings, applied rules and transformation logics as per the source and target system requirements.
  • Used Informatica as the ETL tool, along with stored procedures, to pull data from source systems/files, then cleanse, transform, and load the data into Teradata using Teradata utilities.
  • Designed the ETLs and conducted review meetings; involved in the implementation of BTEQ jobs.
  • Wrote appropriate code in the conversions as per the business logic using BTEQ, FastExport, MultiLoad, and FastLoad scripts.
  • Performed data quality analysis, gathered information to determine data sources, data targets, data definitions, data relationships, and documented business rules.
  • Worked with business analysts to identify appropriate sources for data warehouse and prepared the Business Release Documents, documented business rules, functional and technical designs, test cases and user guides.
  • Coded in Teradata BTEQ SQL and wrote UNIX scripts to validate, format, and execute the SQL in the UNIX environment.
  • Created mapping documents to outline data flow from sources to targets. Parsed high-level design specification to simple ETL coding and mapping standards.
  • Involved in performance tuning of the Informatica Power Center Mappings/Sessions.
  • Used Self-CAT for validating, promoting, importing and exporting jobs from development environment to testing environment.
  • Created ETL reusable processes for Error handling & Audit tracking.
  • Created test cases for Unit testing and System testing.
  • Developed Processes to cleanse the source data as per Business Supplied Cleansing rules.
  • Wrote complex SQL queries to extract data from GE Sales Management systems.
  • Extracted data from different SAP source system tables using Informatica.
  • Created schedulers for daily running of workflows to populate data into warehouse.
  • Responsible for Unit Testing, Integration Testing and helped with User Acceptance Testing.
  • Involved in Initial loads, Incremental loads and Daily loads to ensure that the data is loaded in the tables in a timely and appropriate manner.
  • Developed Informatica processes to extract data from source to staging.
  • Developed ETL processes to load various ODS tables, Dimensions and Facts.
  • Applied transformation rules by leveraging a wide variety of Informatica transformations such as SQL, Lookup, Aggregator, and Router transformations.
  • Generated sessions and workflows and applied suitable properties to improve performance.
  • Performed unit testing of the implemented solutions in the development and testing environments and deployed them in production.
  • Tuned the performance of jobs by following Control-M best practices and applied several methods to reduce workflow run times.
  • Scheduled jobs and implemented dependencies where necessary using Control-M.
  • Managed post-production issues and delivered all assignments/projects within the specified timelines.
  • Involved in unit testing, Integration testing and User acceptance testing of the mappings.
  • Involved in Migrating the Informatica objects using Unix SVN from Dev to QA Repository.
  • Worked on developing workflows and sessions and monitoring them to ensure data is properly loaded on to the target tables.
  • Responsible for scheduling workflows, error checking, production support, maintenance and testing of ETL procedures using Informatica session logs.
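The BTEQ bullets above usually boil down to a UNIX script that generates or embeds a BTEQ deck and submits it to Teradata. A minimal illustrative sketch (host, credentials, and table names are hypothetical, and the actual `bteq` submission is commented out so the sketch runs without a Teradata client):

```shell
#!/bin/sh
# Hypothetical UNIX wrapper that writes a BTEQ script via a heredoc.
# Logon string and table names are placeholders, not real systems.
BTEQ_SCRIPT=/tmp/load_claims.bteq

cat > "$BTEQ_SCRIPT" <<'EOF'
.LOGON tdhost/etl_user,********;
.SET ERROROUT STDOUT;
INSERT INTO dwh.claims_stg
SELECT * FROM stg.claims_raw;
.IF ERRORCODE <> 0 THEN .QUIT 8;
.LOGOFF;
.QUIT 0;
EOF

# In production the script is submitted as: bteq < "$BTEQ_SCRIPT"
# Here the sketch only verifies the deck was generated.
[ -s "$BTEQ_SCRIPT" ] && echo "bteq script generated"
```

The `.IF ERRORCODE <> 0 THEN .QUIT 8` line is what lets the calling shell script (and the scheduler behind it) detect a failed SQL step from BTEQ's exit code.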

Environment: SQL Server, TOAD 8.0, Packages, Triggers, Indexes, XML, Oracle, OLAP, Normalization, UNIX, Windows 7/8.1, UNIX Shell Programming.

Confidential, PA

ETL Developer

Responsibilities:

  • Migrated Legacy data from DB2 to Teradata.
  • Involved in DWH upgrades for source system changes.
  • Utilized all the features of the Source Qualifier transformation, such as filter, joiner, sorter, and SQL override, to the fullest extent at the source level.
  • Worked with external stored procedures for data cleansing purposes.
  • Extensively used the Lookup transformation and lookup caches to look up data from relational sources and flat files.
  • Created mapping parameters and variables and wrote parameter files.
  • Created UNIX shell scripts for various needs.
  • Involved in doing Unit Testing, Integration Testing and Data Validation.
  • Implemented various performance tuning techniques by finding the bottlenecks at the source, target, mapping, and session levels and optimizing them.
  • Worked with the Debugger Wizard in debugging the mappings.
  • Extensively worked with session logs and workflow logs for error handling and troubleshooting.
  • Worked with the Control-M scheduling team to schedule Informatica jobs as per the required frequency.
  • Worked with the Cognos team in generating various reports.
  • Involved in preparing the Migration Documents.
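Control-M jobs like the ones above typically call a thin shell wrapper that starts the Informatica workflow with `pmcmd` and propagates its exit code to the scheduler. An illustrative sketch (service, domain, and workflow names are hypothetical, and `pmcmd` is stubbed as a shell function so the sketch runs without an Informatica installation):

```shell
#!/bin/sh
# Illustrative Control-M-style wrapper: start an Informatica workflow with
# pmcmd and surface its exit code to the scheduler. All names below are
# placeholders; pmcmd is stubbed so this sketch is self-contained.
pmcmd() { echo "stub: pmcmd $*"; return 0; }   # stand-in for the real binary

FOLDER=DWH_LOADS
WORKFLOW=wf_load_daily

# -wait blocks until the workflow finishes; pmcmd then returns 0 on success.
pmcmd startworkflow -sv infa_svc -d Domain_DWH -f "$FOLDER" -wait "$WORKFLOW"
rc=$?

# Control-M treats any non-zero exit as a job failure and fires its
# retry/alert rules, so the wrapper simply reports and propagates the code.
if [ "$rc" -ne 0 ]; then
    echo "workflow $WORKFLOW failed with rc=$rc" >&2
fi
echo "wrapper exit code: $rc"
```

Keeping the scheduler's view limited to a single exit code is what makes dependencies ("run the fact load only after the dimension load succeeds") straightforward to express in Control-M.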

Environment: MS SQL Server 2014/2012/2008 R2/2008, SQL Server Data Tools (SSRS/SSAS/SSIS), T-SQL, Enterprise Manager, C#, Microsoft Azure, Data Factory.

Confidential, Chicago, IL

ETL Developer

Responsibilities:

  • Created queries in SQL Server Management Studio to verify all Device exceptions within the organization, making sure they are deployed or decommissioned.
  • Conducted meetings and emailed teams to discuss solutions to verify inventory records.
  • Built and supported the transformation of various data inputs into the Enterprise Data Warehouse.
  • Implemented enhancements to the current ETL programs based on the new requirements.
  • Identified data requirements from source systems and defined ETL processes for extracting data from external systems.
  • Transformed data from various sources such as MS Excel, MS Access, and SQL Server using SSIS packages and the Import/Export Wizard.
  • Performed data analysis and data profiling using complex SQL queries on various source systems, including MS SQL Server.
  • Performed tuning of the ETL programs to increase processing and loading efficiency.
  • Analyzed and designed solutions, created technical specifications for integration, and consulted with developers and QA personnel on their tasks related to imports and extracts.
  • Responsible for final work product within assigned projects as related to data, data design, transformations and quality of results.
  • Involved in production support on a rotation basis to run and monitor daily, weekly, monthly, quarterly, and ad hoc loads.
  • Coordinated with Informatica Admin teams to deploy Informatica code across multiple environments.
  • Worked Extensively with Business Analysts and Users to understand the Business Requirements and created ETL Mapping Documents.
  • Developed ETL processes detailing the various extraction patterns to handle interfaces such as queues, database pulls, and flat files.
  • Performed Data Profiling on source systems data and provided statistics to Business for providing cleansed data.
  • Provided Data quality assurance & reporting.
  • Implemented the best practices for the creation of Mappings, sessions and Workflows and performance optimization.
  • Able to extract and manipulate data in a large data warehouse environment.
  • Demonstrated ability to create insightful analytic results and communicate results and methodology effectively with partners.
  • Developed complex Stored Procedures, Functions, SSIS packages, Triggers, Cursors, Tables, Views, Indexes and other SQL joins and statements for applications.
  • Migrated data from various branches of the company using MS SQL Integration Service
  • Corrected/Improved the ETL process of defining defaults within Data Warehouse by giving ETL full control of assigning defaults instead of database defaults, thereby giving ETL visibility of default assignment and simplifying the Lookup process.
  • Accurately translated the business requirement for the expiration of duplicate records.
  • Scheduled and performed data transfers using SSIS including filtering to the appropriate data mart for data analysis.
  • Developed and edited many complex reports with SSRS.
  • Designed and implemented user login and security.
  • Created different types of reports including drill down, drill through and parameterized.
  • Reorganized database structures as needed, automated procedures at regular intervals.
  • Optimized long running Stored Procedures and Queries for effective data retrieval.

Environment: Informatica Power Center 8.5.1, Oracle 11g, PL/SQL, SQL Plus, TOAD.

Confidential, Culver City, CA

ETL Developer

Responsibilities:

  • Worked with various sources such as Oracle, SQL Server, DB2, Teradata, and flat/Excel files.
  • Developed strategies for data extraction from various source systems, transformation and loading into data warehouse target systems.
  • Participated in and architected schema and object design discussions.
  • Worked to improve the efficiency of the data warehouse application through the use of indexes, compression, materialized views, and other techniques.
  • Defined and implemented ETL development standards and procedures for the data warehouse environment.
  • Led the design of the ETL solution; oversaw the ETL low-level design and design-phase deliverables.
  • Responsible for T-SQL tuning, multi-tasking, and optimizing long-running report queries with SQL Server 2008/2005.
  • Created Complex ETL Packages using SSIS/SSRS to extract data from staging tables to partitioned tables with incremental load.
  • Participated in execution and documentation of Unit testing of various modules.
  • Involved in audit and reconciliation of migrated data.
  • Worked on database connections, SQL joins, cardinalities, loops, aliases, views, aggregate conditions, parsing of objects and hierarchies.
  • Deployed Informatica Processes in Multiple environments and supported the initial loads and incremental loads.
  • Created SSIS Reusable Packages to extract data from Multi formatted Flat files, Excel, XML files into UL Database and DB2 Billing Systems.
  • Designed and created complex source to target mappings using various transformations
  • Developed PL/SQL routines for migrating data related to various entities such as subtotals, products, customers, etc.
  • Developed, deployed, and monitored SSIS Packages.
  • Created SSIS packages using SSIS Designer to export heterogeneous data from OLE DB sources (Oracle) and Excel spreadsheets to SQL Server 2005/2008.
  • Performed operations like Data reconciliation, validation and error handling after Extracting data into SQL Server.
  • Worked on SSIS Package, DTS Import/Export for transferring data from Database (Oracle and Text format data) to SQL Server.
  • As a developer, was responsible for systems study, analysis, and understanding of systems design and database design.
  • Designed and created SQL Databases, tables, indexes, and views based on user requirements.

Environment: Informatica Power Center 9.5/9.1, CDMA, ABC, Facets, Toad, Tidal, Teradata, FastLoad, FastExport, BTEQ, Unix, Informatica servers on Unix, PuTTY, MS Visio, MS Office Suite, TDQ, Oracle 11g, MVS, SQL Server, Windows NT/2000.

Confidential, Austin, TX

ETL Developer

Responsibilities:

  • Used Informatica PowerCenter 10.x/9.x/8.6.1/8.5 and all its features extensively in migrating data from OLTP systems to the Enterprise Data Warehouse.
  • Developed mappings to extract data from SQL Server, Oracle, Teradata, SFDC, Siebel, and flat files and load it into Teradata using PowerCenter.
  • Proficient in HL7 standards for interoperability of health information technology.
  • Involved in analyzing and development of the Data warehouse.
  • Performed tuning and optimization of complex SQL queries using Teradata Explain.
  • Expert level knowledge of complex SQL using Teradata functions, macros and stored procedures.
  • Experience in upgrading from Informatica version 9.6 to 10.1.
  • Used ODI tools such as OdiFileMove, OdiFileAppend, and OdiFileCopy.
  • Created ODI packages and scenarios using interfaces, variables, and procedures.
  • Used ETL/SSIS and T-SQL stored procedures to transfer data from OLTP databases to staging area and finally transfer into data marts and performed action in XML.
  • Created complex mappings in Power Center Designer using Aggregate, Expression, Filter, and Sequence Generator, Update Strategy, Union, Lookup, Joiner, XML Source Qualifier and Stored procedure transformations.
  • Used Pentaho Data Integration Designer to create ETL transformations
  • Designed and developed the Crystal Reports that feed the dashboard.
  • Cleansed and scrubbed the data into uniform data types and formats using Informatica MDM and IDQ tools; loaded it to STAGE and HUB tables, then to the EDW, and finally to dimensions; rolled up/aggregated the data by business grain into the FACT tables.
  • Worked on power exchange to create data maps, pull the data from mainframe, and transfer into staging area. Wrote numerous BTEQ scripts to run complex queries on the Teradata database.
  • Developed Complex transformations, Mapplets using Informatica to Extract, Transform and load (ETL) data into Data marts, Enterprise Data warehouse (EDW) and Operational data store (ODS).

Environment: Informatica 8.x/7.x/6.2/5.2, Oracle 8i/9i, UNIX, Windows NT 4.0, UNIX Shell Programming, PL/SQL, Business Objects 4.7, TOAD (Quest Software), Business Objects XI R3.1/R2.0, Business View Manager, MS Excel, Live Office.

Tek Resource, Austin, TX

BI Developer

Responsibilities:

  • Developed mappings/Reusable Objects/Transformation/Mapplets by using mapping designer, transformation developer and Mapplets designer in Informatica Power Center 9.6/9.5/9.1.
  • Involved in Design and development of new data warehouse (Analytical Data Warehouse) for better Reporting and analysis.
  • Involved in Data Modeling of the Oracle Database and Debugging Stored Procedures.
  • Worked on developing Informatica mappings, Mapplets, sessions, Worklets, and workflows for data loads.
  • Created complex mappings in Power Center Designer using Aggregator, Expression, Filter, Sequence Generator, Update Strategy, Union, Lookup, Joiner, XML Source Qualifier, and Stored Procedure transformations, using delimiters to define the source document structure for HL7 standards.
  • Developed Mappings which loads the data in to Teradata tables with SAP definitions as Sources.
  • Analyzed complex ETL requirements/tasks and provided estimates/ETCs.
  • Information is stored in table definitions in the repository and is entered using the DataStage import/export options.
  • Good Knowledge on applying rules and policies using ILM (Information Life Cycle Management) workbench for Data Masking Transformation and loading into targets.
  • Worked as an ILM architect to design and implement Informatica ILM products with a proper strategy, including Data Masking and Data Archival.
  • Primary activities included data analysis, identifying and implementing data quality rules in IDQ, and finally linking the rules to the Power Center ETL process for delivery to other data consumers.
  • Used advanced features of T-SQL in order to design and tune T-SQL to interface with the Database and other applications in the most efficient manner and created stored Procedures for the business logic using T-SQL.
  • Worked on UNIX and Oracle SQL Developer to develop queries and create procedures and Package Oracle.
  • Worked with data warehouse staff to incorporate best practices from Informatica.
  • Developed complex mappings and SCD Type-II mappings in Informatica to load data from various sources to ODS tables; hands-on experience with data steward tools such as Data Merge and Hierarchy Manager in MDM.
  • Used Informatica to extract data from DB2, HL7, XML, and flat files and load it into Teradata.
  • Created and modified Informatica Mappings and Workflows to load flat files to Oracle database.
  • Data Loads to warehouse by extracting data from sources like Oracle and Delimited Flat files.
  • Worked with ETL Migration Team and Migrated Informatica folders from Dev to Test repository
  • Profiled the data using Informatica Analyst tool to analyze source data (Departments, party and address) coming from Legacy systems and performed Data Quality Audit.
  • Performed Unit testing, Integration testing, and coordinated with QA for UAT for code change and enhancement.
  • Developed mappings in Informatica to load data from different sources into the Data Warehouse, using transformations such as Source Qualifier, Java, Expression, Lookup, Aggregator, Update Strategy, and Joiner.
  • Used ODI 11g/12c for extraction, loading, and transformation (ETL) of data in the data warehouse.
  • Worked with various IDQ transformations like Standardizer, Match, Association, Parser, Weighted Average, Comparison, Consolidation, Decision, and Expression.
  • Working with Different Teradata source systems to analyze data and load it into the Data warehouse.
  • Developed complex Talend ETL jobs to migrate data from flat files to the database.
  • Implemented custom error handling in Talend jobs and worked on different methods of logging.
  • Followed the organization-defined naming conventions for flat file structures, Talend jobs, and the daily batches that execute the Talend jobs.
  • Studied and analyzed source data systems in terms of business usage and data quality; actively involved in data model design, MDM configuration proposals, and data profiling.
  • Worked on Informatica profiling for a large volume of healthcare data.

Environment: Informatica Power Center 9.1, Oracle 10g/9i, SQL, PL/SQL, SQL*Plus, Sybase, XML files, Sun Solaris 5.8, Windows 7/8, TOAD 7.0, PL/SQL Developer, Crystal Reports XI, MS Excel, Import Wizard, Central Management Console, Info View, True Task Scheduler, SQL Server Management Studio
