
Informatica Developer Resume


Alpharetta, GA

SUMMARY:

  • Eight-plus (8+) years of experience in the development, testing and implementation of Business Intelligence solutions using Data Warehouse/Data Mart design, ETL, Tableau, OLAP, BI and client/server applications with Teradata.
  • Extensive ETL and data integration experience developing ETL mappings and scripts.
  • Over 4 years of programming experience as an Oracle PL/SQL developer in the analysis, design and implementation of business applications using the Oracle Relational Database Management System (RDBMS).
  • Hands-on data warehousing ETL experience in clustered environments using Informatica PowerCenter 10.1/9.6.1/9.5 client tools (Mapping Designer, Repository Manager, Workflow Manager/Monitor) and server tools (Informatica Server, Repository Server Manager).
  • Expertise in Data Warehouse/Data Mart, file feeds, ODS, OLTP and OLAP implementations covering project scoping, data modeling, ETL development, system testing, implementation, production support and user-interface support, plus offshore-onshore coordination, close work with the development team, and work estimation and distribution.
  • Experience in dimensional modeling using Star and Snowflake schemas, identifying facts and dimensions, and physical and logical data modeling using BizTalk, ERwin and ER-Studio.
  • Expertise in working with relational databases such as Oracle 11g/10g, SQL Server 2012/2014/2016 and Teradata.
  • Extensively used ETL methodology to support data extraction, transformation and load processing using Informatica BDE.
  • Worked on exception-handling mappings for data quality, data profiling, data cleansing and data validation using IDQ.
  • Excellent skills in Oracle ODI, Netezza, OBIEE, Teradata, SQL Server and DB2 database architecture.
  • Hands-on experience developing stored procedures, functions, views, triggers and complex SQL queries using SQL Server, Oracle SQL and Oracle PL/SQL (a minimal PL/SQL sketch follows this list).
  • Experienced in writing and tuning complex SQL queries, Triggers and Stored procedures in SQL Server, Oracle Exadata, Teradata.
  • Expert in building Data Integration, Data Visualization, Workflow solutions, and ETL solutions for clustered data warehouse using SQL Server Integration Services (SSIS).
  • Experience as Business Intelligence Developer using Microsoft BI framework (SQL server, SAS, SSIS, SSAS, SSRS) in various business domains including Finance, Insurance, and Information Technology.
  • Experience writing expressions in SSRS and expert in fine-tuning reports; created many drill-through and drill-down reports using SSRS.
  • Experience using Visio and ERwin design tools to create Star and Snowflake schemas for both planned deployments and ad hoc modeling.
  • Experience writing procedures and functions in PL/SQL, and troubleshooting and performance-tuning PL/SQL scripts.
  • Hands-on experience with ETL processes using SQL Server Integration Services (SSIS), Bulk Copy Program (BCP) and Data Transformation Services (DTS).
  • Experience in creating profiles using the Informatica Data Quality Developer and Analyst tools.
  • Experience in exporting IDQ objects as mappings or mapplets from IDQ Developer/Analyst to PowerCenter and using them in PowerCenter.
  • Database testing (ETL), report testing, functionality, E2E and regression testing.
  • Proficient in the integration of various data sources with multiple relational databases like Oracle 11g/10g, MS SQL Server, CRM, OFSAA, DB2, Teradata, VSAM files and flat files into the staging area, ODS, data warehouse and data mart.
  • Maintained data virtualization models in a CASE tool such as ERwin, PowerDesigner or ER/Studio.
  • Experience in using Automation Scheduling tools like Autosys and Control-M.
  • Hands-on experience with TFS across all stages of the Software Development Life Cycle (SDLC), including business requirement analysis, data mapping, build, unit testing, system integration and user acceptance testing.
  • Experience in UNIX shell scripting, FTP, Change Management process and EFT file management in various UNIX environments.
  • Highly motivated to take independent responsibility as well as ability to contribute and be a productive team member with excellent Verbal and Communication Skills and clear understanding of Business procedures.
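
For illustration, a minimal sketch of the kind of rerunnable PL/SQL load procedure described above; the table and procedure names are hypothetical, not taken from any project below.

    -- Hypothetical names throughout; a sketch of a rerunnable staging-to-fact load.
    CREATE OR REPLACE PROCEDURE load_daily_sales (p_batch_date IN DATE) AS
    BEGIN
      -- Delete any rows already loaded for this batch date so reruns are safe
      DELETE FROM sales_fact WHERE batch_date = p_batch_date;

      -- Load cleansed rows from staging into the fact table
      INSERT INTO sales_fact (sale_id, customer_id, amount, batch_date)
      SELECT s.sale_id, s.customer_id, s.amount, p_batch_date
      FROM   stg_sales s
      WHERE  s.amount IS NOT NULL;

      COMMIT;
    EXCEPTION
      WHEN OTHERS THEN
        ROLLBACK;
        RAISE;  -- surface the failure to the calling job or scheduler
    END load_daily_sales;
    /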

TECHNICAL SKILLS:

Databases: Oracle 7.x/8i/9i/10g/11g (SQL, PL/SQL, Stored Procedures, Triggers), MS SQL Server 2012/2014/2016, DB2/UDB, Teradata, SAP tables and MS Access.

ETL Tools: Informatica PowerCenter 10.1/9.6.1/9.5/8.6.1 (Designer, Workflow Manager, Workflow Monitor, Repository Manager, Informatica Server), SSIS (Integration Services), Informatica Domain, Power Exchange CDC, Ab Initio 1.8, Talend.

Data Modeling Tools: ERwin 4.0, MS Visio, Informatica Data Virtualization.

Languages/Utilities: SQL, JDBC, PL/SQL, Python, UNIX shell scripts, UML, SOAP UI, Perl, Web Services, JavaScript, HTML, XML/XSD, .NET, Eclipse, C, C#.

IDE/Tools: PuTTY, Toad, SQL Developer, SQL*Loader, HP Quality Center.

Operating Systems: UNIX (Sun Solaris, Linux, HP-UX, AIX), Windows NT, Windows XP, Windows 7/8/10.

Scheduling Tools: Tidal, AutoSys 11, UC4.

Testing Tools: QTP, WinRunner, LoadRunner, unit and system testing, Quality Center, Test Director, ClearTest, ClearCase.

PROFESSIONAL EXPERIENCE:

Confidential, Alpharetta, GA

Informatica Developer

Responsibilities:

  • Worked extensively with complex mappings using different transformations such as Source Qualifier, Expression, Filter, Joiner, Router, Union, Unconnected/Connected Lookup, Aggregator, Stored Procedure and Normalizer.
  • Created ETL mappings using Informatica PowerCenter to extract data from multiple sources such as flat files, Oracle and XML (direct and indirect methods), transformed it per business and user-interface requirements, loaded it into the data warehouse and generated XML.
  • Created database objects such as stored procedures, views and tables.
  • Used Oracle sources and generated XML with the XML Generator transformation (an Oracle SQL/XML sketch follows this list).
  • Optimized mapping performance through tests on sources, targets and transformations; identified and removed bottlenecks and applied performance-tuning logic to targets, sources, mappings and sessions for maximum efficiency.
  • Collaborated with Curam software developers and supported developers and testers as they set up builds for dev/test/prod environments.
  • Responsible for report generation using SQL Server Reporting Services (SSRS), Object Oriented Design (OOD) and Crystal Reports based on business requirements.
  • Finalized business rules and implemented them in IDQ.
  • Created mappings, workflows and applications using Informatica BDE Developer.
  • Developed efficient SSIS packages for processing fact and dimension tables with complex transformations.
  • Used Power Exchange to source copybook definitions and row-test data from data files and VSAM files.
  • Designed and built customizations and configuration extensions on top of the IBM Curam Core Framework.
  • Involved in performance tuning of the database, including index maintenance, optimizing SQL statements and monitoring the server.
  • Provided a single environment for data integration and data migration with role-based tools that share common metadata, using Informatica Data Virtualization.
  • Responsible for deploying, scheduling, alerting on and maintaining SSIS packages.
  • Monitored scheduled QA and production ETL jobs in CA Workload Automation.
  • Regularly interact with Business Intelligence leadership on project work status, priority setting and resource allocations.
  • Worked in a fast-paced Agile environment and attended daily scrum meetings.
  • Performed data integration, data quality management and data analysis activities using the Lavastorm BRE analytical tool.
  • Helped with data profiling, specifying and validating rules (Scorecards), and monitoring data quality using the Informatica Analyst tool.
  • Created files as targets using BDE Developer from various sources: Salesforce, SQL Server, Oracle and flat files.
  • Configured match and survivorship rules in IDQ.
  • Performed data quality checks, cleansed incoming data feeds and profiled source-system data per business rules using IDQ.
  • Customized UNIX scripts as required for preprocessing steps and to validate input and output data elements, along with BizTalk and DataStage routines.
  • Created test cases and assisted in UAT testing.
  • Participated in deployment, MVC, system testing and UAT.
  • Created POC for the migration of Lavastorm BRE graphs to Informatica PowerCenter 9.6.1.
  • Set strategy and oversee design and development of EDW staging areas and target tables.
  • Work with support team to define methods for and potentially implement solutions for performance measuring and monitoring of all data movement technologies.
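
The XML generation above was done with PowerCenter's XML Generator transformation; purely to illustrate the shape of the output, an equivalent document can be sketched directly in Oracle SQL/XML (table, columns and element names are hypothetical).

    -- Hypothetical table/columns; sketches XML generated from an Oracle source.
    SELECT XMLSERIALIZE(
             DOCUMENT
             XMLELEMENT("Customer",
               XMLATTRIBUTES(c.customer_id AS "id"),
               XMLELEMENT("Name", c.customer_name),
               XMLELEMENT("City", c.city)
             ) AS CLOB INDENT
           ) AS customer_xml
    FROM customers c;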

Environment: Informatica PowerCenter 9.6.1, CA Workload Automation, Curam, RDBMS (Oracle ODI, SQL Server 2016), SSIS, flat files, XML, Lotus Notes, C#, IDQ, Power Exchange, ServiceNow, DBCM, Windows 8/7, Java, XML files, CSV files.

Confidential, New Orleans, LA

ETL (Informatica) Developer

Responsibilities:

  • Understood and reviewed the functional requirements received from different states with the Business Analyst and signed off on the requirements documentation.
  • Migrated data from SQL Server 2012 to SQL Server 2014 using SSIS as the ETL tool.
  • Design and develop PL/SQL packages, stored procedure, tables, views, indexes, and functions; implement best practices to maintain optimal performance.
  • Design, develop, and test Informatica mappings, workflows, worklets, reusable objects, SQL queries, and Shell scripts to implement complex business rules.
  • Developed both one-time and real-time mappings using PowerCenter 9.6 and Power Exchange.
  • Maintained source definitions, transformation rules and target definitions using Informatica Repository Manager.
  • Performed/automated many ETL related tasks including data cleansing, data conversion, and transformations to load Oracle 10G based Data Warehouse and OBIEE.
  • Performed data quality checks, cleansed incoming data feeds and profiled source-system data per business rules using IDQ.
  • Created IDQ mapplets for address, telephone and SSN cleansing and used them as mapplets in PowerCenter.
  • Migrated legacy Oracle and DB2 databases to SQL Server.
  • Developed mappings, transformations and mapplets using Mapping Designer, Transformation Developer and Mapplet Designer in Informatica PowerCenter and BDE.
  • Used DataStage Manager to import metadata from the repository, create new job categories and create new data elements.
  • Helped the development team deploy and test the application, which uses a SQL Server 2012 back end.
  • Extensive work in BizTalk, SSRS, SAS, SSAS, SSIS, MVC, BIDS, BIML, MS SQL Server, SQL programming and MS Access.
  • Worked extensively on Erwin and ER Studio in several projects in both OLAP and OLTP applications.
  • Duplicated all the non-production databases on the Exadata server in accordance with the data warehouse requirements.
  • Enable Agile Business Intelligence (BI) with data virtualization.
  • Developed efficient restartability strategies for SSIS package failures.
  • User acceptance, E2E, multi-browser, regression and smoke testing.
  • Validated and debugged old mappings, tested workflows and sessions, and identified better technical solutions; identified bottlenecks in old and new mappings and tuned them for better performance.
  • Scheduled Informatica Cloud Service jobs using the Informatica Cloud task scheduler.
  • Expert in creating parameterized, drill-down, drill-through, sub, linked, snapshot, cached and ad hoc reports using SSRS.
  • Understood ETL requirement specifications to develop HLDs and LLDs for SCD Type 1, Type 2 and Type 3 mappings, and was involved in testing various data/reports.
  • Designed and developed the T-SQL logic for handling slowly changing dimension table loads by flagging records via an update strategy to populate the desired target rows (a minimal T-SQL sketch follows this list).
  • Created profiles on the Source table to find the Data anomalies using IDQ.
  • Migrated databases from Oracle to SQL Server using the OLE DB provider by creating SSIS packages.
  • Developed Shell/Perl and MFT scripts to transfer files via FTP and SFTP and to automate ETL jobs.
  • Migrated Informatica mappings, sessions and workflows from the development environment to QA, checking the developed code into TortoiseSVN for release and exception management.
  • Documented data mappings/transformations per the B2B and user-interface requirements.
  • Deployed the SQL Server 2012 SSIS packages.
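
A minimal T-SQL sketch of the SCD Type 2 flagging pattern referenced above; dim_customer, stg_customer and the tracked columns are hypothetical names, not the project's actual schema.

    -- Hypothetical schema; expires the current row, then inserts a new version.
    -- Step 1: expire the current dimension row when a tracked attribute changed.
    UPDATE d
    SET    d.is_current = 0,
           d.end_date   = GETDATE()
    FROM   dbo.dim_customer AS d
    JOIN   dbo.stg_customer AS s
           ON s.customer_id = d.customer_id
    WHERE  d.is_current = 1
      AND  (s.city <> d.city OR s.segment <> d.segment);

    -- Step 2: insert a new current version for changed or brand-new customers.
    INSERT INTO dbo.dim_customer
           (customer_id, city, segment, start_date, end_date, is_current)
    SELECT s.customer_id, s.city, s.segment, GETDATE(), NULL, 1
    FROM   dbo.stg_customer AS s
    LEFT JOIN dbo.dim_customer AS d
           ON d.customer_id = s.customer_id AND d.is_current = 1
    WHERE  d.customer_id IS NULL;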

Environment: Informatica PowerCenter 9.1.0/9.6.1, Oracle 11g/10g RAC, ESP, PuTTY, ERwin, XML files, CSV files, SQL, PL/SQL, SSIS, Informatica Data Virtualization, Linux, OBIEE, IBM DataStage 8.5, UNIX shell scripting, Netezza, Ab Initio Data Profiler, Windows 8/7, BIDS, BIML, Informatica IDQ 9.1/9.5, EXTJS, Power Exchange, SQL Server 2014, SSIS/SSRS, Informatica Cloud, Toad 3.0, T-SQL, Aginity, Cognos, BO BI 4.0.

Confidential, Carrollton, TX

ETL Informatica Developer

Responsibilities:

  • Interacted with Data Modelers and Business Analysts to understand the requirements and the impact of the ETL on the business.
  • Designed ETL specification documents and templates for all the projects.
  • Extracted data from flat files, Greenplum, DB2, SQL and Oracle ODI to build an Operational Data Store (ODS); applied business logic to load the data into the Global Data Warehouse.
  • Extensively worked on Facts and Slowly Changing Dimension (SCD) tables.
  • Used various transformations like Filter, Router, Expression, Lookup (connected and unconnected), Aggregator, Sequence Generator, Update Strategy, Joiner, Normalizer, Sorter and Union to develop robust mappings in the Informatica Designer.
  • Worked on different tasks in Workflow Manager such as Sessions, Event Raise, Event Wait, Decision, Email, Command, Worklets, Assignment, Timer and workflow scheduling.
  • Developed various dashboards and used context filters and sets while dealing with huge volumes of data in Tableau.
  • Created NZLoad scripts to load flat files into Netezza staging tables (a Netezza SQL sketch follows this list).
  • Configured related Tidal scheduler jobs and data conversion, and performed unit, load and partner testing of Cisco systems.
  • Performed extensive data quality checks using the Ab Initio Data Profiling tool.
  • Wrote programs in SAS and R to generate reports, creating RTF and HTML listings, tables and reports using SAS/ODS for ad hoc report generation.
  • Designed and Optimized Data Connections, Data Extracts, Schedules for Background Tasks and Incremental refresh for the weekly and monthly dashboard reports on Tableau Server.
  • Used UNIX shell scripting extensively to enhance the Perl scripts and to develop, schedule and support Control-M batch jobs that schedule data migration and report generation; the Perl and shell scripts invoke stored procedures for data loads, computation and report generation.
  • Used PowerCenter Data Virtualization, which leverages web-service protocols including XML, WSDL, SOAP and REST, to access data, domain asset management, metadata and CRM integration workflows within an SOA.
  • Involved in Unit, Integration, System, and Performance testing levels.
  • Wrote documentation describing program development, logic, coding, testing, changes and corrections.
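
The NZLoad scripts above used the Netezza command-line loader; the equivalent set-based load can be sketched in Netezza SQL with an external table (the file path, table names and delimiter here are hypothetical).

    -- Hypothetical names/paths; external-table equivalent of an nzload flat-file load.
    CREATE EXTERNAL TABLE ext_orders
    SAMEAS stg_orders
    USING (
      DATAOBJECT ('/data/feeds/orders.dat')  -- pipe-delimited flat file
      DELIMITER '|'
      SKIPROWS 1                             -- skip the header row
      LOGDIR '/data/logs'
    );

    -- Load the flat file into the Netezza staging table in one insert.
    INSERT INTO stg_orders SELECT * FROM ext_orders;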

Environment: Informatica PowerCenter 9.6.1, Oracle 10g, SQL Server 2012, SnapLogic, WinCVS, Informatica Data Virtualization, ERL, SQL Server 2008, IBM iSeries (DB2), Tableau 8.2, MS Access, UNIX, T-SQL, Windows XP, .NET, Ab Initio Data Profiler, Toad, Cognos 8.4.1, SQL Developer.

Confidential, Detroit, Michigan

ETL Tester

Responsibilities:

  • Tested Complex ETL Mappings and Sessions based on business user requirements and business rules to load data from DB2, source flat files and RDBMS tables to target tables.
  • Experience in data analysis, data integration, data migration, conceptual data modeling and metadata creation.
  • Performed business analysis, data analysis and dimensional data modeling.
  • Created the test environment for Staging area, loading the Staging area with data from multiple sources.
  • Conducted source-system data analysis to understand current state, quality and availability of existing data.
  • Performed Informatica Data Quality checks, data validation and data cleansing to produce accurate reports and analysis for C-suite executives to use in key decision making.
  • Worked with DataStage for data extraction, transformation and loading (ETL).
  • Tested all DataStage parallel jobs during extraction, transformation and loading of data using DataStage in parallel-processing mode.
  • Involved in daily scrum meetings (Agile methodology) and in iteration/sprint planning meetings to plan the stories to be developed and tested in the upcoming sprint based on priority and estimated effort.
  • Experience in performance tuning of Teradata SQL queries and Informatica mappings.
  • Wrote several complex SQL queries for validating Cognos reports.
  • Worked with business team to system test the reports developed in Cognos.
  • Tested whether the reports developed in Cognos are as per company standards.
  • Involved in testing the XML files and checked whether data is parsed and loaded to staging tables.
  • Expertise working in Agile (Scrum) and Waterfall environments.
  • Responsible for data mapping testing by writing complex SQL queries using WinSQL (a sample validation query follows this list).
  • Tested the database for field-size validation, check constraints and stored procedures, cross-verifying field sizes defined within the application against the metadata.
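
For illustration, typical reconciliation queries used in this kind of ETL testing; the schema and table names are hypothetical.

    -- Hypothetical schemas/tables; typical source-to-target reconciliation checks.

    -- Row-count reconciliation between source and target.
    SELECT (SELECT COUNT(*) FROM src_schema.orders)      AS source_rows,
           (SELECT COUNT(*) FROM dwh_schema.orders_fact) AS target_rows
    FROM   dual;

    -- Column-level check: rows present in the source but missing from the target.
    SELECT order_id, order_amount
    FROM   src_schema.orders
    MINUS
    SELECT order_id, order_amount
    FROM   dwh_schema.orders_fact;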

Environment: Oracle 11g/10g, PL/SQL Developer, UML, Ab Initio Data Profiler, Teradata, SQL*Plus, Informatica PowerCenter 9.x/8.x/7.x, .NET, HP Quality Center, TOAD, ESP, Oracle Reports, UNIX, Oracle Report Builder, ERwin Data Modeler.
