
Senior Informatica Consultant/Data Engineer Resume

Costa Mesa, CA

SUMMARY

  • 10+ years of experience in Data Integration and Data Warehousing using the ETL tool Informatica PowerCenter 10.X/9.6/9.1/8.6 (Source Analyzer, Warehouse Designer, Mapping/Mapplet Designer, Sessions/Tasks, Worklets/Workflow Manager). Knowledge of Informatica tools PowerExchange, PowerAnalyzer, PowerConnect, Metaquery, data marts, OLAP and OLTP.
  • Extensively used enterprise data warehousing ETL/Business Intelligence methodologies to support data extraction, transformation and loading in a corporate-wide ETL solution using Informatica Power Center.
  • Expert-level mastery in designing and developing complex mappings to extract data from diverse sources including flat files, RDBMS tables, legacy system files, XML files, Applications, COBOL Sources and Teradata.
  • Worked with Hadoop, HDFS, Hive, Impala, HQL queries and Sqoop.
  • Extensive data warehouse experience using Informatica Power Center and Informatica PowerExchange (CDC) for designing and developing transformations, mappings and sessions.
  • Proficient with Informatica Big Data Developer and Informatica Data Quality (IDQ) for data cleansing and massaging in the staging area.
  • Proficient in implementing Complex business rules by creating re-usable transformations, workflows/worklets and Mappings/Mapplets.
  • Experience in Performance Tuning of targets, mappings and sessions.
  • Developed shell/Python scripts to handle incremental loads.
  • Thorough knowledge of database management concepts like conceptual, logical and physical data modeling and data definition, population and manipulation. Expertise in logical and physical database design and Data Modeling using data modeling tools like Erwin.
  • Experienced in using Tableau 10.4 and Business Objects XIR3/XIR2 to build user-defined queries and reports that enable drill-down and slice-and-dice analysis on multiple databases.
  • Strong expertise in relational database systems like Teradata, Oracle, SQL Server, MS Access, Sybase and DB2, including design and database development using SQL, PL/SQL, SQL*Plus, TOAD and SQL*Loader. Highly proficient in writing, testing and implementing triggers, stored procedures, functions, packages and cursors using PL/SQL.
  • Experience in scheduling of ETL jobs using Control-M, Tivoli, CA Workload Automation.
  • Created scripts using Fast Load, Multi-Load to load data into the Teradata data warehouse.
  • Good knowledge of SFDC.
  • Data warehousing domain experience including banking, insurance, health, credit card, utilities and pharmaceutical. Proven ability to implement technology-based solutions for business problems.
  • Designed and wrote the scripts required to extract, transform, load (ETL), clean and move data and metadata so it can be loaded into a data warehouse, data mart or data store.
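
The incremental-load scripting mentioned above can be sketched as follows. This is a minimal illustration, not the actual project code: the row structure, the `updated_at` column name and the watermark handling are all assumptions.

```python
from datetime import datetime

# Hypothetical sketch of an incremental (delta) load: only rows newer than
# the last recorded watermark are picked up, and the watermark advances so
# the next run starts where this one left off.
def incremental_rows(rows, last_watermark):
    """rows: list of dicts with an 'updated_at' datetime; returns (new_rows, new_watermark)."""
    new_rows = [r for r in rows if r["updated_at"] > last_watermark]
    new_watermark = max((r["updated_at"] for r in new_rows), default=last_watermark)
    return new_rows, new_watermark

rows = [
    {"id": 1, "updated_at": datetime(2024, 1, 1)},  # already loaded
    {"id": 2, "updated_at": datetime(2024, 1, 5)},  # new since last run
]
delta, wm = incremental_rows(rows, datetime(2024, 1, 2))
```

In a real shell/Python wrapper the watermark would be persisted (control table or file) between runs rather than passed in-memory.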

TECHNICAL SKILLS

ETL Tools: Informatica Power Center 10.X/9.X/8.X, SAP, Power Exchange 5.x, IDQ, SSIS.

OLAP/BI: Cognos, Spot Fire, Cognos IWR, OBIEE, Business Objects 5.0/6.5.

Data Modeling: Erwin 4.0, Star-Schema Modeling, FACT and Dimension Tables

DBMS: Oracle 11g/10g/9i/8i, Microsoft Access, SQL Server 2005/2008, MS Excel, Flat Files, Teradata V13.0/12.0, Sybase, Netezza

Languages: C, C++, Java, JavaScript, Python, SQL, PL/SQL, T-SQL, HTML, DHTML, XML, UNIX Shell Scripting, Visual Basic, ASP, JSP, Macromedia software, JCL.

Scripting Languages: Java Script, VB Script and UNIX Shell Scripting.

Operating Systems: Windows 2008/2003/NT/XP, HP-UX, Linux, AIX

Design Skills: Object Oriented Analysis Design using UML.

Others: MS Project, MS Office Suite, TOAD (Tool for Oracle Application Development).

PROFESSIONAL EXPERIENCE

Confidential

Senior Informatica Consultant/Data Engineer

Responsibilities:

  • Work closely with Project Teams and participate in full software development life-cycle from requirements gathering, estimating, designing, development, and implementation.
  • Design, develop, test, and document best practice and scalable Informatica workflows, mappings, mapplets and transformations.
  • Ensure that ETL jobs fully conform to governor limits, API batch restrictions, and all other requirements of integrating with the Salesforce.com platform.
  • Involved in the development of Informatica mappings and tuned them for better performance.
  • Tune performance of Informatica jobs to ensure optimal efficiency and performance.
  • Analyze the legacy source systems to identify business rules around data profiling, data flows, data transformations, and data cleansing.
  • Perform validations against transactional records and make corrections to resolve data quality issues.
  • Participate in code review and recommend solutions to enhance ETL programs to project team and management.
  • Create Structured Query Language (SQL) scripts (stored procedures, SQL functions).
  • Create deployment instructions, including failure recovery procedures, and monitor successful data loads of deployments.
  • Developed Python Scripts to parameterize the sessions parameters during the decomposition.
  • Making changes in the configuration/Parameter files and performance tuning of existing Informatica jobs.
  • Data loading, data conversion, ensuring data validation and loading of error & audit tables.
  • Participate in multiple mock data conversion tests to validate the conversion process and reduce implementation risk.
  • Design, develop and support the data warehouse using appropriate tools and techniques; design integration solutions involving IICS and Informatica Powercenter.
  • Develop Python Scripts to generate the JIL file to create the Autosys Jobs.
  • Work with the Quality Assurance Team in developing test scripts and test plans for validating the accuracy of ETL jobs and the quality/integrity of loaded data.
  • Assist in post implementation-verification support.
  • Work with IICS Application Integration components like Processes, Service Connectors and Process Objects.
  • Well versed with all Informatica Client Components (PowerCenter Designer, Workflow Manager, Workflow Monitor, Repository Manager)
  • Mentor/coach others in Informatica, ETL and Data Warehousing by sharing knowledge, best practices and lessons learned from relevant past experiences.
  • Provide production on-call support to existing systems.
  • Assist in continuous improvement efforts and enhancing project team methodology and performance.
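
The Autosys JIL generation described in this role can be sketched roughly as below. The job name, machine, owner, schedule and command are placeholders, and real JIL definitions carry many more attributes than this minimal renderer emits.

```python
# Hypothetical sketch: render an Autosys JIL definition for a command (CMD)
# job from a small dict of attributes, the way a Python script might emit
# one JIL block per generated Informatica workflow.
def render_jil(job):
    lines = [f"insert_job: {job['name']}   job_type: CMD"]
    for key in ("command", "machine", "owner", "start_times"):
        if key in job:
            lines.append(f"{key}: {job[key]}")
    return "\n".join(lines) + "\n"

jil = render_jil({
    "name": "wf_daily_load",  # placeholder job/workflow name
    "command": "pmcmd startworkflow -sv intsvc -d Domain -f DailyLoads wf_daily_load",
    "machine": "etl-host01",
    "owner": "etluser",
    "start_times": '"02:00"',
})
```

The rendered text can then be fed to the `jil` command-line utility to register the job.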

Environment: Informatica Power Center 10.2 & 10.4, SFDC, Oracle 11g, SQL Server 2017, ServiceNow, Toad 10, DB Visualizer, Toad Data Modeler, shell/Perl scripting, Python, SQL, Control-M.

Confidential

Senior Informatica Consultant/Data Engineer

Responsibilities:

  • Worked closely with business analysts to understand and document business needs for decision support data.
  • Used the Update Strategy Transformation to update the Target Dimension tables.
  • Developed ETL procedures to transform the data in the intermediate tables according to the business rules and functionality requirements.
  • Involved in the development of Informatica mappings and tuned them for better performance.
  • Created Oracle database, users, base table, and views using proper distribution key structure.
  • Used Informatica Power Connect for Oracle Exadata to pull data from Oracle Exadata data warehouse.
  • Developed mapping parameters and variables to support connection for the target database as Oracle and source database as Oracle OLTP database.
  • Created Mapplets and used them in different Mappings.
  • Used designer debugger to test the data flow and fix the mappings. Tuned Informatica Mappings and Sessions for optimum performance.
  • Developed shell /python scripts to handle incremental loads.
  • Provided detailed technical, process and support documentation like daily process rollback and detailed specifications and detailed document of all the projects with the workflows and their dependencies.
  • As an ETL/Informatica software developer, worked in Confidential’s BSA-AML (Bank Secrecy Act/Anti-Money Laundering) Compliance Technology group in the ALM, Capital Markets, Finance and Risk (ACFR) organization within the Technology Line of Business.
  • Worked on ETL development of the Oracle Mantas application in OFSAA (Oracle Financial Services Analytical Applications).
  • Worked with mapping parameters and variables to load data from different sources to corresponding partition of the database table.
  • Worked extensively in PL/SQL to migrate the data from Oracle to Oracle Exadata database.
  • Created Schema objects like Indexes, Views, and Sequences.
  • Extracted data from BRD files, flat files and Oracle and loaded it through Informatica.
  • Worked with Crontab for job scheduling.
  • Production Support and issue resolutions.
  • Involved in Unit testing of Mappings, Workflows and debugging mappings for failed sessions.
  • Created partitions, SQL override in source qualifier, session partitions for improving performance.
  • Performed unit testing and system testing of the mappings developed and documented with various scenarios.

Environment: Informatica Power Center 10.2/10.1, SQL Server 2017, Python, Azure Databricks, Oracle 11g, PL/SQL, XML, Linux, PuTTY, WinSCP

Confidential, Costa Mesa, CA

BI Developer

Responsibilities:

  • Working on the enterprise-level Data Warehouse (EDW).
  • Working on different data marts and a data lake for different domains, e.g. Automotive, Emergency Roadside Assistance, Insurance, Travel, Membership and Payments.
  • Working on Informatica Cloud, AWS, Hadoop, Hive, Impala and HQL queries for different databases.
  • Extract, transform and load data through Informatica Power Center and Informatica Developer.
  • Creation of the design document for the data load process.
  • Parsed the Informatica workflows out to a grain of 1:1 workflow:session using Python scripts, and manipulated the workflow XML files to configure the connections.
  • Loading data from/to Teradata, SQL Server, Oracle, Hadoop, Hive and API.
  • Used unstructured data like PDF files, spreadsheets, Word documents, legacy formats and print streams to get normalized data using Informatica B2B Data Exchange.
  • Used Informatica B2B Data Exchange to process structured data such as XML.
  • Export & Import of workflows in different environments.
  • Making changes in the configuration/Parameter files and performance tuning of existing Informatica jobs.
  • Data loading, data conversion, ensuring data validation and loading of error & audit tables.
  • Maintaining the data retention policy of the organization.
  • Scheduling the ETL jobs in Control M.
  • Working in an Agile methodology, implemented in Clarizen and JIRA.
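
The workflow-XML decomposition described above can be sketched with Python's standard XML library. The export snippet below is trimmed and illustrative; the element names follow the PowerCenter XML export format, but the attributes and names are simplified assumptions.

```python
import xml.etree.ElementTree as ET

# Simplified sketch of splitting a PowerCenter workflow export toward a 1:1
# workflow:session grain: list the SESSION elements so each one can later be
# wrapped in its own generated workflow.
workflow_xml = """
<POWERMART>
  <REPOSITORY>
    <FOLDER NAME="EDW">
      <WORKFLOW NAME="wf_daily">
        <SESSION NAME="s_m_load_customers"/>
        <SESSION NAME="s_m_load_orders"/>
      </WORKFLOW>
    </FOLDER>
  </REPOSITORY>
</POWERMART>
"""

root = ET.fromstring(workflow_xml)
# Collect every session in document order; a real script would also copy
# connection and attribute elements when rewriting the XML.
sessions = [s.get("NAME") for s in root.iter("SESSION")]
```

Connection changes work the same way: locate the relevant elements, set their attributes, and serialize the tree back out with `ET.tostring`.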

Environment: Informatica Power Center 10.2, Informatica Developer, Teradata, Oracle 11g, SQL Server, Python, Hadoop, HDFS, Hive, Impala, Sqoop, pipe-delimited flat files, XML, JSON, WinSCP, Tableau, Business Objects, Erwin 7.2

Confidential, Charlotte, NC

Informatica Developer

Responsibilities:

  • Worked on the Outages Management System.
  • Customer data flows from InService 9.2 (the OMS database) to downstream applications.
  • From the OMS database, data is stored in a staging table at the granular level of Customer Account Number and Premise ID.
  • Creation of the design document for the customer load process.
  • Creation of Informatica mappings, mapplets, worklets and workflows.
  • Load data from SQL Server, Flat Files & Oracle to Oracle Database.
  • Created the SFDC, Flat File and Oracle connections for AWS Cloud services.
  • Performance tuning of existing Informatica jobs.
  • Export & Import of workflows in different environments
  • Making changes in the configuration/Parameter files already developed for OMS History, Proactive Communication, IFactor (Outages Maps) and Reliability Metric.
  • Data loading and data conversion (customer ID and premise ID patterns changed).
  • Creation and loading of error and audit tables, and ensuring data validation.
  • One-time conversion of transactional tables in OMS Primary, Archival and NRT (Near Real Time) databases.
  • Weekly full load of customer data by the ETL process.
  • Maintaining the data retention policy of the organization.
  • Scheduling the ETL jobs in CA Workload Automation.
  • Working in an Agile methodology, implemented in JIRA.
  • Wrote VB Script for data loading and worked on the Amazon Redshift cloud data integrator 10.

Environment: Informatica Power Center 10.2, SAP HANA, Oracle 11g, SQL Server 2017, pipe-delimited flat files, Amazon Web Services (AWS) cloud, Amazon Redshift cloud data integrator 10, Oracle Exadata, XML

Confidential, Bartlesville, OK

Informatica Developer

Responsibilities:

  • Created high-level and low-level design documents and an ETL standards document.
  • Involved in Extraction, Transformation and Loading (ETL) Process.
  • Extracted data from flat files, mainframes, DB2 and Oracle databases, and applied business logic to load them into the central Oracle database.
  • Designed & developed Informatica mappings, mapplets, worklets and workflows to create load-ready data files for loading Oracle E-Business Suite.
  • Designed and developed reports for the user interface, according to specifications given by the track leader.
  • Involved in performance tuning at source, target, mapping and session level.
  • Loaded Oracle tables from XML sources.
  • Configured Informatica for the SAP Connector.
  • Extracted data from SAP and loaded it into Oracle EBS.
  • Introduced the concept of a data dashboard to track the technical details, given the continuous requirement changes and rework needed.
  • Worked on creating physical layer, business model, mapping and presentation layer in OBIEE.
  • Created source system containers, subject areas, narrative reports in containers for OBIEE.
  • Retrieved data from SAP using Informatica Power Exchange.
  • Supported Integration testing by analyzing and fixing the issues.
  • Created Unit Test Cases and documented the Unit Test Results.
  • Resolved Skewness in Teradata.
  • Defined Interface parameters in Informatica mappings and sessions to initiate the concurrent programs (Stored Procedures) once the Interface tables are loaded.
  • Integrated Data Quality routines in the Informatica mappings to standardize and cleanse the name, address, and contact information.
  • Profiled customer data and identified various patterns of the phone numbers to be included in IDQ plans.
  • Used Informatica web services to create work requests/work Items for the end user.
  • Successfully Integrated Multiple XML sources and created a de-normalized, flat-structured file.
  • Used Perl scripts to archive the older logs using Informatica command task.
  • Integrated the Salesforce data into the target Oracle database using Informatica Cloud.
  • Validated the Salesforce target data in the Force.com application.
  • Created Invoices, Cash Receipts, RMA and RMA Start records in Salesforce from Oracle EBS.
  • Involved in unit testing, Integration testing and User acceptance testing of the mappings.
  • Created customized OBIEE model in the RPD to retrieve the RPD data into dashboard.
  • Scheduled the workflows to pull data from the source databases at weekly intervals.
  • Used various performance enhancement techniques to enhance the performance of the sessions and workflows.
  • Performance tuning on sources, targets, mappings and database.
  • Worked as production support SME to investigate and troubleshoot data issues coming out of Weekly and Monthly Processes.
  • Worked with business to provide them daily production status report in the form of issues, their priority and business impact along with recommended short term and long term solution.
  • Used database level Greenplum partitioning and Informatica hash partitioning.

Environment: Informatica Power Center 10.2, SAP BW, SFDC, Oracle 11g, DB2, SQL Server 2017, ServiceNow, Toad 10, DB Visualizer, Unix, Toad Data Modeler, Greenplum DB, SUSE Linux, shell/Perl scripting, SQL, Control-M.

Confidential, FL

Informatica Developer

Responsibilities:

  • Interacted actively with Business Analysts and Data Modelers on Mapping documents and Design process for various Sources and Targets
  • Created complex mappings in Power Center Designer using Aggregator, Expression, Filter, Sequence Generator, Update Strategy, Union, Lookup, Joiner and XML Source Qualifier transformations.
  • Analyzing, designing and implementing complex SQL stored procedures, ETL processes and Informatica mappings.
  • Used Tidal scheduler to get the source file from the server using Tidal flat file FTP connection as well as power center FTP connection.
  • Created Oracle Exadata database, users, base table and views using proper distribution key structure.
  • Used Informatica Power Connect for Oracle Exadata to pull data from Oracle Exadata data warehouse.
  • Developed mapping parameters and variables to support connection for the target database as Oracle Exadata and source database as Oracle OLTP database.
  • Ran the all workflows using the Tidal scheduler.
  • Worked on Pentaho Data Integration installation, covering content management, execution, security, scheduling, etc.
  • Worked with the toolbar, perspective toolbar, sub-toolbar, and design and view tabs of Pentaho Data Integration.
  • Created Pentaho ETL jobs and performed performance monitoring and logging.
  • Configured Pentaho’s thin kettle JDBC driver.
  • Executed transformations for debugging in Pentaho DI; worked on L2 and L3 production support/monitoring of the daily and nightly ETL loads.
  • Implemented SCD Type 1 and Type 2 mappings to capture new changes and to maintain historical data.
  • Providing technical assistance during production phase of project development
  • Defined and developed technical standards for data movement and transformation as well as review all designs to ensure those standards are met.
  • Handled extraction of various types of source files Flat files, XML standard source data of different transactions and loading to staging area.
  • Configured Informatica for the SAP Connector.
  • Extracted data from SAP and loaded into Oracle EBS.
  • Configured PowerExchange for SAP R/3.
  • Retrieved data from SAP R/3.
  • Designed and written the scripts required to extract, transform, load (ETL), clean, and move data and metadata so it can be loaded into a data warehouse, data mart, or data store.
  • Installed and configured Informatica PowerExchange for CDC and Informatica Data Quality (IDQ).
  • Created CDC (change data capture) sources in PowerExchange and imported them into Power Center.
  • Created custom plans for product name discrepancy checks using IDQ and incorporated the plan as a mapplet into Power Center.
  • Configured Informatica Power Exchange add on for SAP (Power Connect)
  • Retrieved data from SAP IDocs using Informatica connector.
  • Used unstructured data like PDF files, spreadsheets, Word documents, legacy formats and print streams to get normalized data using Informatica B2B Data Exchange.
  • Used Informatica B2B Data Exchange to process structured data such as XML.
  • Worked extensively in PL/SQL to migrate the data from Oracle to Oracle Exadata database.
  • Experience in SQL Server & SSRS report migration from SQL Server 2000 to SQL Server 2005, SQL Server 2008 to SQL Server 2012 and SQL Server 2008 to SQL Server 2012 R2.
  • Created drill-down, drill-through, linked and sub-reports using SSRS, and resolved issues and errors generated in SSRS.
  • Created new mappings and enhancements to the old mappings according to changes or additions to the Business logic.
  • Converted Oracle DDLs to Netezza DDLs.
  • Created the format of the unit test documents per Netezza Framework.
  • Created the NZ/SQL procedures on Netezza using Aginity Workbench.
  • Retrieved error logs on UNIX for Netezza data loads from Oracle to Netezza.
  • Optimized the NZ-SQL queries.
  • Optimized the BOXI dashboard SQL with aggregated and sub queries.
  • Retrieved data from simple object access protocol (SOAP) using the existing XSD from an XML file using web services hub.
  • Retrieved data from web services and validated the response using Informatica expression transformation like date, zip, location formats etc.
  • Configured Informatica web services hub in administration console.
  • Worked with Informatica web services
  • Created scripts using Fast Load, Multi-Load to load data into the Teradata data warehouse.
  • Created the Visio diagram for Informatica workflows to be scheduled in DAC.
  • Scheduled workflows in DAC and populated the Reporting layer of OBIEE.
  • Created the source system containers, customized the data warehouse load, manipulated the columns in data warehouse tables etc in DAC.
  • Created the custom logical/physical folders in DAC.
  • Performed on-demand synchronization from SFDC to Oracle through the cloud.
  • Created projects in ILM for data masking with different parameters like commit interval, encryption key and degree of parallelism.
  • Created the expression, encryption and SSN replacement data masking techniques in the data masking transformation of Informatica Power Center.
  • Performed data masking for the limited trust zone using the data masking transformation of Informatica Power Center.
  • Integrated the data from Oracle to Sales Force (SFDC) using Informatica cloud.
  • Worked on Migration of mappings from Data Stage to Informatica.
  • Updated numerous BTEQ/SQL scripts, making appropriate DDL changes, and completed unit and system tests.
  • Validated the Salesforce target data in the Force.com application.
  • Configured the SFDC license in the administration console.
  • Created automated schedules to run tasks at specific times as needed to migrate data to/from SFDC, the database and the ERP system.
  • Extensive knowledge on Master Data Management (MDM) concepts.
  • Extensive experience designing, managing and administering MDM/DIW objects using the Kalido DIW/MDM 8.5/9 tool.
  • Worked on designing catalogs, categories, sub-categories and user roles using Kalido MDM 9.
  • Extracted data from the Salesforce legacy system, SalasVision and Charles River (trading platform).
  • Provided enterprise data warehousing solutions, including design and development of ETL processes, mappings and workflows using Informatica’s Power Center.
  • Created the Salesforce connections in Informatica Power Center.
  • Expert in designing and scheduling complex SSIS Packages for transferring data manually from multiple data sources to SQL server.
  • Expert in creating, configuring and fine-tuning ETL workflows designed in DTS and MS SQL Server Integration Services (SSIS).
  • Profiled customer data and identified various patterns of the phone numbers to be included in IDQ plans.
  • Worked on Windows PowerShell for automation of remote work stations.
  • Worked on the Ab Initio graphical development environment (GDE).
  • Created Ab Initio graphs using input (flat file), Reformat, Join, Rollup, Concatenate and output components.
  • Worked on creating parallel Ab Initio jobs for faster reads.
  • Designed the Informatica mappings based on Ab Initio code.
  • Fixed the existing components by comparing the Informatica code with the Ab Initio graph.
  • Worked with the Component Object Model (COM) and Windows Management Instrumentation (WMI) using Windows PowerShell scripting.
  • Created Windows PowerShell scripts using cmdlets in the Windows PowerShell environment.
  • Used the pipeline in Windows PowerShell to enable one cmdlet to be piped into another.
  • Participated in design workshops, providing technical insight and knowledge.
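
The SSN-replacement masking in this role was built with Informatica's data masking transformation; the Python sketch below only illustrates the underlying idea of deterministic substitution. The salt and hashing scheme are assumptions for the example, not the project's actual configuration.

```python
import hashlib

# Illustrative sketch of deterministic SSN replacement masking: the same
# input always maps to the same fake SSN, preserving joins across masked
# tables, while the original value stays hidden behind a salted hash.
def mask_ssn(ssn, salt="demo-salt"):
    digest = hashlib.sha256((salt + ssn).encode()).hexdigest()
    # Keep only digits from the hex digest and shape them like an SSN.
    digits = "".join(ch for ch in digest if ch.isdigit())[:9].ljust(9, "0")
    return f"{digits[:3]}-{digits[3:5]}-{digits[5:]}"

masked = mask_ssn("123-45-6789")
```

Repeatability is the property that matters for a "limited trust zone": every environment masking the same source row ends up with the same substitute value.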

Environment: Informatica Power Center 9.6.1, IDQ, Oracle 11g, Netezza 7.2.0, Netezza TwinFin 6 (production), TwinFin 3 and Netezza Skimmer (non-production), Exadata, web services, DAC, Ab Initio, Teradata V13.0, Cognos BI 8.3, Informatica B2B Data Transformation (DT)/Data Exchange (DE), data masking, MDM, Relational Junction, DB2, flat files, SSIS, PL/SQL, SQL*Plus, TOAD, UNIX, SAP, shell scripting, Autosys, Big Data, Erwin 4.2, Tidal scheduler, Hadoop, SFDC, Windows PowerShell.

Confidential, Houston, TX

Informatica Developer

Responsibilities:

  • Running batch cycles, which involve job triggers from Informatica, querying tables in Teradata/Oracle SQL Developer, report generation and claims archiving maintenance.
  • Involved in creating stored procedures and using them in Informatica
  • Implemented the claims data conversion process to move claims from Claims Workbench to the CNG Navigator application.
  • Providing technical assistance during production phase of project development
  • Defined and developed technical standards for data movement and transformation as well as review all designs to ensure those standards are met.
  • Worked with the command line program pmcmd to interact with the server to start and stop sessions and batches, to stop the Informatica server and to recover sessions.
  • Designed workflows that use multiple sessions and command line objects (which are used to run the UNIX scripts).
  • Created source and target mappings, transformation logic and processes to reflect the changing business environment over time.
  • Provided enterprise data warehousing solutions, including design and development of ETL processes, mappings and workflows using Informatica’s PowerCenter.
  • Responsible for migration of the work from dev environment to testing environment
  • Provided guidance and expertise to resolve technical issues related to DW Tools and primarily Informatica.
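
The pmcmd interaction described above can be sketched by assembling the standard pmcmd flags. The service, domain, folder and workflow names below are placeholders, and the command is only constructed here, not executed.

```python
# Sketch of building a `pmcmd startworkflow` command line. The flags used
# (-sv service, -d domain, -u user, -pv password-env-var, -f folder, -wait)
# are the standard pmcmd ones; names are illustrative only.
def pmcmd_startworkflow(service, domain, folder, workflow, user, pwd_env="PMPASS"):
    return [
        "pmcmd", "startworkflow",
        "-sv", service,
        "-d", domain,
        "-u", user,
        "-pv", pwd_env,   # password read from an environment variable
        "-f", folder,
        "-wait", workflow,
    ]

cmd = pmcmd_startworkflow("intsvc", "Domain_ETL", "Claims", "wf_claims_load", "etluser")
```

A scheduler or shell wrapper would hand this list to `subprocess.run` (or join it for a shell command) and check the pmcmd exit code for success or failure.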

Environment: Informatica Power Center 9.0.1, Oracle 10G, Cognos BI 8.3, Relational Junction, DB2, Flat files, PL/SQL, SQL*Plus, TOAD, UNIX, Shell Scripting, Control-M, Erwin 4.2.

Confidential, Portsmouth, NH

Informatica Developer

Responsibilities:

  • Worked with Informatica PowerMart client tools like Source Analyzer, Warehouse Designer and Mapping Designer.
  • Worked on Informatica tools like Source Analyzer, Target Designer, Mapping Designer, Workflow Manager, and Workflow Monitor.
  • Extracted data from flat files and loaded it into the EDW.
  • Worked on multiple inbound ASN files to create XML; created a list file with the entire inbound ASN file list on UNIX and used indirect loading and transformation methods in Informatica.
  • Handled extraction of various types of source files Flat files, XML standard source data of different transactions and loading to staging area.
  • Wrote Data loading stored procedures, functions using PL/SQL from Source systems into operational data storage.
  • Worked with the DBA to build dimension tables and fact tables.
  • Created Complex Mapping for generating the parameter files.
  • Created source and target mappings, transformation logic and processes to reflect the changing business environment over time.
  • Designed and developed SSIS packages, store procedures, configuration files, tables, views, and functions; implement best practices to maintain optimal performance.
  • Utilized SSIS (SQL Server Integration Services) to produce a Data Warehouse for reporting.
  • Mainly involved in developing Star Schema (Facts and dimensions based on future requirements).
  • Developed source to target mappings and scheduling Informatica sessions.

Environment: Informatica Power Center 8.6, Oracle 10G, WLM, UNIX AIX, DB2, Flat files, PL/SQL, SQL*Plus, TOAD, UNIX, Shell Scripting, Tivoli, Erwin 4.2
