
Sr Informatica Developer Resume


Irving, TX

SUMMARY:

  • 7+ years of strong experience with Informatica ETL tools and in system analysis and design, development, testing, integration, and production support for data warehousing on client-server and web-enabled applications.
  • Experience delivering high-performance bulk data integration and data warehousing processes spanning a variety of cloud and on-premise applications through an intuitive web-based UI; strong business analysis skills and an understanding of the software development life cycle (SDLC) utilizing Rational Unified Process (RUP).
  • Extensive experience in data warehousing and data architecture: extraction, transformation, and ETL data loads from various sources into data warehouses and data marts using Informatica PowerCenter.
  • Involved in all SDLC aspects of ETL, including requirement gathering, data cleaning, data load strategies, mapping design and development, providing standard interfaces for various operational sources, unit/integration/regression testing, and UAT.
  • Created Talend ETL jobs to receive attachment files from POP e-mail using tPOP, tFileList, and tFileInputMail, loaded data from the attachments into the database, and archived the files.
  • Experienced with B2B Data Transformation Studio and Informatica B2B Data Exchange.
  • Experience in writing SQL queries and optimizing the queries in Sybase, Oracle, and SQL Server 2000.
  • Expertise in a broad range of technologies, including business process tools such as Microsoft Project, Primavera, ProModel, MS Excel, MS Access, and MS Visio, technical assessment tools, data warehousing concepts, and web design and development.
  • Created database objects such as tables, views, materialized views, procedures, and packages using Oracle tools like TOAD, PL/SQL Developer, and SQL*Plus (see the sketch after this list).
  • Proficient in developing Use Case Model, Analysis Model, Design Model, Implementation Model, Use Case Diagrams, Behaviour Diagrams (Sequence diagrams, Collaboration diagrams, State chart diagrams, Activity diagrams), Class Diagrams based on UML using Rational Rose.
  • Implemented BI security (authorization and authentication), cache management, report creation, intelligent dashboard configuration, and BI Server component customization; enabled the BI Office plug-in; upgraded OBIEE 11.1.1.5.4 to OBIEE 11.1.1.6.0 and integrated OBIEE with Essbase.
  • Installed and configured SAP BusinessObjects for the supported databases.
  • Used BusinessObjects to provide performance management, planning, reporting, query and analysis, and enterprise information management.
  • Extensive success in translating business requirements and user expectations into detailed specifications employing Unified Modelling Language (UML).
  • Performed Gap Analysis to check the compatibility of the existing system infrastructure with the new business requirements.
  • Strong knowledge of cloud-based integration and of system integration delivered as cloud services, with hands-on use of cloud tools.
  • Worked on real-time, in-memory processing engines such as Spark and Impala, and on their integration with BI tools such as Tableau and OBIEE.
  • Extensive experience developing SOA middleware based on Fuse ESB and Mule ESB; configured Elasticsearch, Logstash, and Kibana to monitor Spring Batch jobs.
  • Excellent experience in designing, modelling, performance tuning and analysis, and implementing processes with the ETL tool Informatica PowerCenter for data extraction, transformation, and loading.
  • Designed end-to-end ETL processes to support reporting requirements, including aggregates, summary tables, and materialized views for reporting.
  • Expertise in using heterogeneous source systems like Flat files (Fixed width & Delimited), XML Files, CSV files, IBM DB2, Excel, Oracle, Sybase, SQL and Teradata.
  • Proficient in designing and developing complex mappings with varied transformation logic such as Unconnected and Connected Lookup, Router, Filter, Expression, Aggregator, Joiner, Update Strategy, etc.
  • Worked on Repository Manager, Workflow Manager, Workflow Monitor, and Designer to develop mappings, mapplets, reusable transformations, tasks, workflows, and worklets to extract, transform, and load data.
  • Strong experience in client requirement analysis, physical and logical design, development, resource planning, coding, debugging, testing, deployment, support, and maintenance of business intelligence applications using SQL Server 2000/2005/2008/2012, DTS, SSIS, SSRS, and SSAS 2005/2008/2012.
  • Experience designing and developing SQL statements and queries for Oracle, Sybase, and SQL Server 2000 databases.
  • POC for integration to IBM BPM server for workflow integration into MDM using v10.1 software.
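
As a flavor of the database-object work described above, the sketch below creates a table and a view from Python using the python-oracledb driver. In practice this work was done interactively in TOAD and SQL*Plus; the credentials, DSN, and object names here are hypothetical.

```python
# A sketch only: real work was done in TOAD / SQL*Plus. Requires the
# python-oracledb package; credentials, DSN, and object names are made up.
import oracledb

conn = oracledb.connect(user="etl", password="secret", dsn="dbhost/ORCLPDB1")
cur = conn.cursor()

# A staging table plus a reporting view over it.
cur.execute("""
    CREATE TABLE stg_orders (
        order_id NUMBER PRIMARY KEY,
        customer VARCHAR2(100),
        amount   NUMBER(12, 2)
    )""")
cur.execute("""
    CREATE VIEW v_big_orders AS
    SELECT order_id, customer, amount
      FROM stg_orders
     WHERE amount > 10000""")
conn.close()
```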

TECHNICAL SKILLS:

  • Informatica PowerCenter 9.5/9.1/8.x/7.x/6.x/5.1
  • Informatica PowerExchange
  • Erwin 4.x/4.0
  • DataStage (7.x, 6.x, 5.x)
  • Star-Schema Modelling
  • Kettle
  • Snowflake Modelling
  • FACT and dimension tables
  • Business Objects
  • C/C++
  • Java SE 1.4
  • VB.Net
  • HTML
  • SQL Server
  • MySQL
  • Pentaho
  • Informatica MDM
  • Oracle 8i
  • TOAD
  • Sybase PowerDesigner (v4 - 11)
  • Sybase System 10
  • Exadata
  • Sybase ASE (11-12) and Sybase ASA (6-9)
  • ASP/ASP.NET
  • B2B DT 9.5
  • B2B DX 9.5
  • IDQ
  • Unix
  • MVS
  • MS Windows 95/98/NT/2000/XP
  • LINUX
  • MS DOS
  • MS Office 2000
  • Informatica 9.0.1, 8.1, 7.1
  • Oracle 11g, 10g
  • Teradata
  • PL/SQL
  • IBM MDM 8.5
  • Pentaho Kettle
  • Pentaho Analyzer
  • MDM Consultant
  • OBIEE 11.1.1.5.4, 10.1.3.4
  • B2B Data Transformation Studio

PROFESSIONAL EXPERIENCE:

Confidential, Irving, TX

Sr Informatica Developer

Responsibilities:

  • Involved in business-user meetings to understand requirements.
  • Installed MDM and used it to link all of the enterprise's critical data to a single master file that provides a common point of reference.
  • Created MDM structures on the database server supporting the relational model, using object-relational features as well as non-relational structures such as JSON and XML.
  • Installed MDM v11 for integration with Watson Explorer; this release provides new features such as improved data-export performance, a configurable search-result display, and reliability enhancements to the Watson Explorer data export.
  • Converted business requirements into technical documents (BRDs) and explained the business requirements in terms of technology to the developers.
  • Created and managed source-to-target mapping documents for all fact and dimension tables.
  • Analyzed the source data to assess data quality using Talend Data Quality.
  • Used BusinessObjects to provide common services for BI deployment and management tools.
  • Used DB2 within an MVC framework and for traditional product packaging.
  • Used ETL methodologies and best practices to create Talend ETL jobs.
  • Used REST APIs alongside web services that expose their own arbitrary sets of operations via WSDL and SOAP.
  • REST relies on a cacheable communications protocol; in virtually all cases HTTP is used (see the HTTP sketch after this list).
  • Used InfoSphere B2B to design functions and web services.
  • Managed IBM InfoSphere master data for single or multiple domains: customers, patients, citizens, suppliers, locations, products, service offerings, and accounts.
  • Created Informatica PowerExchange registrations and data maps.
  • Created scripts to build new tables, views, and queries for new enhancements in the application using TOAD.
  • Worked on Informatica Cloud and extracted data from the Salesforce source.
  • Used Kettle's Java-based architecture and open, XML-based configuration, which includes support for integrating security and data management tools.
  • Used MongoDB, open-source software that avoids the traditional table-based relational database structure in favor of JSON-like documents with dynamic schemas (MongoDB calls the format BSON); see the document sketch after this list.
  • Created Pentaho jobs and transformations to load data from CSV and Excel files into the MySQL database.
  • Created jobs and transformations in Pentaho Kettle to move Oracle Data Pump files.
  • Used InfoSphere DataStage for data integration; it can integrate data on demand across many high-volume data sources and target applications.
  • Installed and configured Hadoop MapReduce and HDFS, and developed multiple MapReduce jobs in Java for data cleaning and pre-processing (see the streaming sketch after this list).
  • Installed Hive and used it to integrate databases in the computing environment.
  • Also used Hive in software design, including builds on the Android operating system.
  • Used Apache Hive to work with data stored in the various databases and file systems that integrate with Hadoop.
  • Involved in processing ingested raw data using MapReduce, Apache Spark, and Kafka.
  • Developed Data Flow diagrams to create Mappings and Test plans.
  • Developed complex Informatica mappings using various transformations: Source Qualifier, Normalizer, Filter, Connected Lookup, Unconnected Lookup, Update Strategy, Router, Aggregator, Sequence Generator, and reusable Sequence Generator transformations.
  • Used Python, which supports multiple programming paradigms: imperative, functional, and procedural styles.
  • Used Talend, which provides self-service tools to catalog, cleanse, and shape data from any source for use anywhere, and facilitates collaboration across batch, bulk, and master data management scenarios.
  • Talend also provides data integration, application integration, master data management, cloud, data preparation, and product pricing capabilities that make the data easy to integrate.
  • Used the Informatica Debugger for optimization, plus pre- and post-session stored procedures to drop and rebuild constraints.
  • Loaded data from files to the Hadoop cluster, summarized big data using Hive (via Pentaho), and loaded the summarized data into Oracle and MySQL for data visualizations and ad hoc reporting.
  • Worked on audit logging, error logging, and reference checks while loading data into the data warehouse.
  • Created Unix scripts for ETL jobs, session log cleanup, and dynamic parameter files.
  • Performed Unit testing, Integration testing and System testing of Informatica mappings.
  • Coded Unix scripts to extract data from different relational systems into flat files used as source files for the ETL process.
  • Used Talend's open-core model for big data, an integration application for big data and Hadoop.
  • Designed various mappings to extract data from sources including flat files, Oracle, Sybase, SQL Server, and IBM DB2.
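
To make the REST bullets above concrete, here is a minimal standard-library Python sketch of a cacheable GET against a hypothetical endpoint; the URL, resource path, and response shape are assumptions.

```python
# Minimal REST-over-HTTP call with the Python standard library.
# The base URL and resource path are hypothetical.
import json
import urllib.request

BASE_URL = "https://api.example.com"

def get_customer(customer_id: int) -> dict:
    # In REST the resource is addressed by the URL; the HTTP verb (GET)
    # carries the operation, unlike WSDL/SOAP's arbitrary operation names.
    req = urllib.request.Request(
        f"{BASE_URL}/customers/{customer_id}",
        headers={"Accept": "application/json"},
        method="GET",
    )
    with urllib.request.urlopen(req) as resp:
        # GET responses are cacheable; intermediaries honor Cache-Control.
        print("Cache-Control:", resp.headers.get("Cache-Control"))
        return json.load(resp)

if __name__ == "__main__":
    print(get_customer(42))
```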
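
The MongoDB document model mentioned above fits in a few lines of pymongo; the server address, database, and collection names below are assumptions for the sketch.

```python
# pymongo sketch of schemaless, JSON-like documents stored as BSON.
# Server address, database, and collection names are assumptions.
from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017")
orders = client["etl_demo"]["orders"]

# Two documents in the same collection with different shapes.
orders.insert_one({"order_id": 1, "customer": "Acme", "total": 125.50})
orders.insert_one({"order_id": 2, "customer": "Zenith", "total": 310.00,
                   "items": [{"sku": "A-17", "qty": 3}]})

for doc in orders.find({"total": {"$gt": 100}}):
    print(doc)
```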
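
The data-cleaning MapReduce jobs above were written in Java; the mapper below is only a Python equivalent, runnable under Hadoop Streaming, that sketches such a cleaning pass under an assumed pipe-delimited, five-field layout.

```python
#!/usr/bin/env python3
# Data-cleaning mapper for Hadoop Streaming (map-only job). Assumes
# pipe-delimited input with 5 fields; malformed rows are dropped.
# Example launch (paths are hypothetical):
#   hadoop jar hadoop-streaming.jar -mapper clean_mapper.py \
#       -input /raw/events -output /clean/events -numReduceTasks 0
import sys

EXPECTED_FIELDS = 5  # assumption about the source layout

def main() -> None:
    for line in sys.stdin:
        fields = [f.strip() for f in line.rstrip("\n").split("|")]
        if len(fields) != EXPECTED_FIELDS:
            continue  # drop rows with the wrong column count
        if not fields[0]:
            continue  # drop rows missing the key field
        print("|".join(fields))

if __name__ == "__main__":
    main()
```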

Environment: Informatica PowerCenter 9.5/9.1, Informatica PowerExchange 9.1, Informatica Cloud, Python, IBM MDM v11, B2B 10, Kettle, IDQ, MongoDB, HDFS, DB2, DB2 Mainframe, MDM Consultant, Metadata Editor, Shell Scripting, Perl Scripting, DataStage, Windows XP, InfoSphere, DataFlux MDM, Informatica MDM, OBIEE 11g, Oracle Exadata, Teradata 13, MS SQL Server 2008 R2, T-SQL, SQL Server 2008, TOAD, PL/SQL Developer, Linux, Unix.

Confidential, Atlanta, GA

Informatica Developer

Responsibilities:

  • Used Informatica PowerCenter for extracting source data and loading it into target tables.
  • Involved in the complete life-cycle implementation of Informatica MDM 9.1 for Customer Master.
  • Performed ETL coding using Informatica PowerCenter 9.1 (Designer, Workflow Manager, Workflow Monitor) and B2B.
  • Installed, configured, and upgraded Fusion Middleware WebLogic (WLS) and Discoverer 11g and configured them with Oracle EBS R12.1.3.
  • Installed and configured Oracle RAC 10g and Grid Control 10g on RHAS 4.0 for Java-based OLTP applications.
  • Used SAP Sybase PowerDesigner, a data modeling and metadata management solution, for data architecture, information architecture, and enterprise architecture.
  • Leveraged PowerDesigner's impact analysis, design-time change management, and metadata management techniques.
  • Used Kettle as a data integration platform to integrate different data sources for building and updating geospatial databases, data warehouses, and services.
  • The platform provides a common set of application and data integration tools to build a service-oriented architecture and to connect, mediate, and manage services in real time.
  • Installed, configured, and tuned Informatica's Data Quality tool and implemented Informatica Data Quality applications for both business and technology users.
  • Performed IDQ checks at the point of entry for each mandatory attribute from each source system, even where attribute values are created well after the initial creation of the transaction.
  • Worked with IBM InfoSphere to integrate MDM into existing business processes and technical architectures.
  • Used the object system (OS) to manage groups of different software packages and networks.
  • Transferred data from one database management system to another using Kettle.
  • Worked on Informatica Data Quality as a service in an SOA/web services architecture.
  • Developed a confidence-factor scoring system: ran and edited scorecards, configured thresholds, and designed scorecarding patterns using Informatica DQ.
  • Used DB2 on Cloud, a DBMS tool, for data integration and data storage.
  • Installed and configured Salesforce and the full set of cloud integration tools.
  • Used Informatica Cloud to connect easily to a variety of cloud, on-premises, mobile, and social data sources.
  • Used DataStage for complex transactions and messages, providing real-time integration.
  • Used InfoSphere on the cloud for development and testing environments, and moved some existing workloads to the cloud without impacting production systems.
  • Installed and configured the Informatica MDM platform on WebLogic Application Server.
  • Designed and developed ETL processes using DataStage Designer to load data into the Oracle database.
  • Created Unix shell scripts for FTP/MFT, error handling, error reports, parameter files, etc. (see the file-transfer sketch after this list).
  • Involved in developing Pig UDFs for needed functionality not available out of the box. Collected, analyzed, and monitored structured and unstructured data using the ELK stack (Elasticsearch, Logstash, Kibana).
  • Worked on column-level data validation rules applied to the source data, record by record, before the data enters the DataStage formatting and standardization stage (see the validation sketch after this list).
  • Used DataStage Manager to implement the Import and Export Interfaces.
  • Tracked and reported on testing activities, including DataStage test-case execution, the status of any defects opened during execution, and overall testing results.
  • Performed Informatica code reviews and ETL design reviews for other team members.
  • Managed Custom data while creating extended attributes to capture custom information and elements that are unique to client’s business.
  • Used SSIS packages to move logins and jobs from one server to another server.
  • Demonstrated out-of-box reports provided within the OPLA solution.
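
The FTP/MFT and error-handling work above was done in Unix shell scripts; the sketch below shows the same pull-with-error-report pattern in Python, with a hypothetical host, credentials, and paths.

```python
# FTP pull with basic error handling and an error report, mirroring what
# the shell scripts did. Host, credentials, and paths are made up.
import ftplib
import sys

HOST, USER, PASSWORD = "ftp.example.com", "etl_user", "secret"

def fetch(remote_path: str, local_path: str) -> None:
    try:
        with ftplib.FTP(HOST, USER, PASSWORD, timeout=60) as ftp:
            with open(local_path, "wb") as out:
                ftp.retrbinary(f"RETR {remote_path}", out.write)
    except ftplib.all_errors as exc:
        # Error report: log the failure and signal the scheduler.
        print(f"FTP transfer failed for {remote_path}: {exc}", file=sys.stderr)
        sys.exit(1)

if __name__ == "__main__":
    fetch("/outbound/customers.dat", "customers.dat")
```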
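
And as a sketch of the column-level validation rules above: the rules and field names here are illustrative, not the project's actual rule set.

```python
# Column-level validation applied record by record before the
# standardization step. Rules and field names are illustrative only.
import csv
import re

RULES = {
    "customer_id": lambda v: v.isdigit(),
    "email": lambda v: re.fullmatch(r"[^@\s]+@[^@\s]+\.[^@\s]+", v) is not None,
    "state": lambda v: len(v) == 2 and v.isalpha(),
}

def failed_columns(row: dict) -> list:
    """Names of columns whose value fails its validation rule."""
    return [col for col, rule in RULES.items() if not rule(row.get(col, ""))]

with open("source.csv", newline="") as src:  # hypothetical source extract
    for record in csv.DictReader(src):
        bad = failed_columns(record)
        if bad:
            print(f"reject {record.get('customer_id')}: invalid {bad}")
```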

Environment: Informatica 9.1/8.6, Linux, Unix, Oracle RDBMS 10g, OBIA/BI Apps 7.9.6/7.9.5, Cloud, Python, SAP Sybase PowerDesigner, Oracle 11g, SSIS, IBM DB2, DataStage, HDFS, Spark, Kafka, Kettle, IDQ, DataStage Enterprise 9.1/8.0/7.5, Windows Server 2003/XP/7, IBM InfoSphere, Metadata Editor, Informatica MDM, Agile Product Lifecycle Management (PLM) for Process 6.0, Unix Shell Scripting, PL/SQL, Oracle Business Intelligence Enterprise Edition (OBIEE) 10.1.3.x, Netezza, OBIEE 11.1.1.6.9, Oracle Product Lifecycle Analytics (OPLA) 3.0, Verdant Data Loader 6.0, ETL.

Confidential

Informatica Developer

Responsibilities:

  • Interacted with the business and analysts to create the functional specs.
  • Designed APIs around the principle of information hiding, using programming interfaces to enable modular programming by hiding the implementation details of the modules.
  • Used the metadata in the Informatica repository tables.
  • Created SQL Server reports, handled subreports, and defined queries for generating drill-down and drill-through reports using SSRS 2005/2008.
  • Created SSIS packages to export and import data from CSV files, text files, and Excel spreadsheets.
  • Used IBM WebSphere DataStage for data quality: setting up rules and best practices and managing project resources.
  • Developed reports and intelligent dashboards for the Global Sales team.
  • Created agents using OBIEE Delivers to send e-mails in case of ETL load failures and for long-running jobs.
  • Used statistical functions like regression to view the performance trends.
  • Created Teradata external loader connections such as MLoad (Upsert and Update) and FastLoad while loading data into the target tables in the Teradata database.
  • Built POCs for implementing Informatica scheduling on the UC4 job automation tool and for making ETL loads flexible and restartable (see the checkpoint sketch after this list).
  • Performance-tuned slow-running reports and designed performance-enhancing structures on the database.
  • Used cloud tools for data and database integration, providing an easy interface to access and retrieve the data.
  • Reviewed and created the design required to support all reporting needs.
  • Obtained sign-off on report and dashboard templates.
  • Optimized SSRS reports through aggressive scoping of data, judicious use of aggregate tables and materialized views, and caching techniques.
  • Used nested stored procedures with complex control flow logic to feed SSRS reports.
  • Reverse-engineered the data model with Erwin.
  • Involved in creating staging tables and mappings and setting the trust level (HDD) in Siperian for the Customer Master system.
  • Coordinated with the database administrator team to make sure database changes were executed correctly on the staging and production instances before loads could start.
  • Modeled the OBIEE RPD to use Informatica repository tables to generate reports on ETL loads: tracking current load status, current and historic performance, and throughput; identifying long-running jobs; viewing performance trends for individual Informatica sessions; and developing metrics to evaluate ETL performance.
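
The "flexible and restartable" loads in the UC4 POC above rest on a simple idea: persist a checkpoint after each committed batch so a rerun resumes where it stopped. Below is a minimal Python sketch of that pattern; the file name and the load step are hypothetical stand-ins.

```python
# Restartable-load sketch: checkpoint after each committed batch.
# The checkpoint file name and load_batch() body are stand-ins.
import json
import os

CHECKPOINT = "load.checkpoint.json"

def load_batch(batch_no: int) -> None:
    print(f"loading batch {batch_no}")  # stand-in for the real ETL step

def run(total_batches: int) -> None:
    start = 0
    if os.path.exists(CHECKPOINT):
        with open(CHECKPOINT) as f:
            start = json.load(f)["next_batch"]  # resume point
    for batch in range(start, total_batches):
        load_batch(batch)
        with open(CHECKPOINT, "w") as f:
            json.dump({"next_batch": batch + 1}, f)  # commit progress
    os.remove(CHECKPOINT)  # clean finish: next run starts from scratch

if __name__ == "__main__":
    run(total_batches=10)
```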

Environment: Informatica PowerCenter, Informatica MDM, OBIEE 11.1.6/11.1.5, PL/SQL, SQL, SQL Server, Informatica Multidomain MDM 9.1, SSIS, Informatica B2B DX/DT v8.0, Oracle 10g/11g, BI Publisher, API, Data Integration, Teradata, DataStage, etc.

Confidential

ETL Developer

Responsibilities:

  • Provided suggestions to the ETL team for performance improvement.
  • Developed the metadata repository (RPD) and configured metadata for the Presentation layer, Business Model layer, and Physical layer.
  • Tested and reviewed the RPD.
  • Performed OBIEE administration activities, including RPD and web catalog deployment to ST, UAT, and production.
  • Supported UAT and fixed UAT trackers.
  • Implemented to production and supported business checkout.
  • Interacted constantly with the DBA and infrastructure support teams to manage dev/test environment space requirements and access issues.
  • Generated dynamic reports from cubes using Report Builder and SSRS.
  • Developed variance reports using Reporting Services (SSRS); see the variance sketch after this list.
  • Studied and understood the business requirements through constant interaction with the business, and produced a detailed technical specification document.
  • Used various features of Sybase PowerDesigner, which gives DBAs robust heterogeneous support for all leading databases and brings impact analysis and design-time change management together with formal database design techniques.
  • Sybase PowerDesigner provides complete modeling for information architecture, unequaled traceability from data source to warehouse or mart, and one cohesive, integrated metadata repository.
  • The most important reports are sourced from the Portfolio subject area to manage overall portfolio risk and sector concentrations.
  • Deployed SSRS reports to the reporting server and assisted in troubleshooting deployment problems.
  • Used the IBM data quality tools in DataStage to help with cleansing for the migration.
  • Responsible for the management and migration of SSIS packages in the staging and pre-production environments.
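
The variance reports above boil down to variance = actual - budget, usually with a percentage against budget; the Python sketch below shows that calculation on made-up figures (the real reports were built in SSRS against warehouse data).

```python
# Variance-report calculation: variance = actual - budget, plus a
# percentage against budget. The rows below are illustrative data only.
rows = [  # (account, budget, actual)
    ("Hardware", 120_000.0, 131_500.0),
    ("Software", 80_000.0, 74_200.0),
    ("Services", 45_000.0, 45_000.0),
]

print(f"{'Account':<10} {'Budget':>10} {'Actual':>10} {'Var':>10} {'Var %':>7}")
for account, budget, actual in rows:
    variance = actual - budget
    pct = (variance / budget * 100) if budget else 0.0
    print(f"{account:<10} {budget:>10,.0f} {actual:>10,.0f} "
          f"{variance:>10,.0f} {pct:>6.1f}%")
```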

Environment: ETL, OBIEE 10.1.3.4.0, Teradata, Java, C, TOAD, DataStage, Informatica 8.1.1, Sun Solaris, Windows, SSIS, SSRS, Oracle 10g, ClearCase, Sybase PowerDesigner, Unix Shell Scripting.
