
Sr. Database and Informatica (ETL) Developer Resume


Tempe, Arizona

SUMMARY

  • 10 years of IT experience spanning analysis, design, development, and maintenance, including 7 years of data warehousing experience using the Informatica ETL (Extraction, Transformation and Loading) tools PowerCenter/PowerMart and PowerExchange.
  • Strong experience working with Informatica ETL (9.5.1/9.1/8.6.1), including the PowerCenter Designer, Workflow Manager, Workflow Monitor, Informatica Server, and Repository Manager components.
  • Solid experience in dimensional data modeling, Star Schema/Snowflake modeling, fact and dimension tables, physical and logical data modeling, Erwin 3.x, Oracle Designer, and Data Integrator.
  • Extensive experience with shell scripting in the UNIX environment; used UNIX scripts with scheduled pmcmd calls to interact with the Informatica server (see the sketch after this list).
  • Extensive experience in creating ETL mappings and mapplets with the mapping wizards in Informatica PowerCenter to move data from multiple sources into common target areas such as data marts and the data warehouse, using connected/unconnected Lookup and Stored Procedure transformations, Update Strategy, Expression, and Aggregator transformations.
  • Hands-on experience tuning mappings and identifying and resolving performance bottlenecks at various levels: sources, targets, mappings, and sessions.
  • Expertise in developing SQL and PL/SQL code (procedures, functions, packages, cursors, and triggers) to implement the business logic of the database.
  • Extensively worked with different source and target systems such as Oracle 11g/10g, DB2, VSAM, COBOL files, flat files, Teradata, Netezza TwinFin 3/6/Skimmer, and XML data.
  • Work experience with Informatica Data Quality and Data Profiler Tools.
  • Good experience in unit testing, system integration testing, and user acceptance testing.
  • Designed and developed efficient reconciliation and error-handling methods and implemented them throughout the mappings.
  • Good knowledge of data partitioning and session partitions in Informatica.
  • Knowledge of business intelligence tools such as Informatica PowerAnalyzer, Cognos 7.0/6.0, Business Objects, and OBIEE (Web Intelligence 2.5, Designer 5.0, Developer Suite & Set Analyzer 2.0).
  • Proven Experience in Full Life Cycle Implementation of Data warehouses.
  • Strong database experience using Oracle 11g/10g/9i/8.x, DB2 8.0/7.0 Control Center, MS SQL Server 2008, Sybase 12.5/12.x, MS Access 7.0/2000, and XML.
  • Excellent verbal, written, interpersonal and communication skills to lead teams and interact with users and team members to understand and meet their business & functional requirements.
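
The pmcmd usage above can be sketched as a small UNIX wrapper. This is a minimal sketch; the domain, service, folder, workflow, and credential names are placeholders, not taken from any actual project:

    #!/bin/ksh
    # Hypothetical wrapper: start an Informatica workflow via pmcmd and
    # wait for completion. All names below are placeholders.
    INFA_USER=etl_user
    INFA_PWD="$ETL_PWD"            # exported by the scheduler, not hardcoded
    DOMAIN=Domain_Dev
    INT_SVC=IS_Dev
    FOLDER=EDW_LOAD
    WORKFLOW=wf_daily_load

    pmcmd startworkflow -sv "$INT_SVC" -d "$DOMAIN" \
        -u "$INFA_USER" -p "$INFA_PWD" \
        -f "$FOLDER" -wait "$WORKFLOW"
    rc=$?
    if [ $rc -ne 0 ]; then
        echo "Workflow $WORKFLOW failed (rc=$rc)" >&2
        exit $rc
    fi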

TECHNICAL SKILLS

ETL Tools: Informatica PowerCenter 9.5.1/9.1/8.6.1, Informatica PowerMart 6.2/5.1/4.7, Trillium, Informatica PowerExchange, Informatica Data Quality (IDQ).

Data Modeling: Dimensional Data Modeling, Data Modeling, Star Join Schema Modeling, Snow-Flake Modeling, Fact and Dimensional Tables, Physical and Logical Data Modeling.

Business Intelligence: Business Objects XIR2, OBIEE

RDBMS: Oracle 11g/10g/9i/8i/7.3, PL/SQL, SQL*Plus, MS SQL Server 2008, 2005, DB2 UDB 7.1, Teradata, Netezza, MySQL, MS Access.

Programming Skills: UNIX shell scripting, PL/SQL, T-SQL, C, Visual Basic 6.0/5.0, XML, XSD, XBRL, Java.

Modeling Tool: Erwin 4.1, Embarcadero, MS Visio.

Operating Systems: UNIX, Windows, Linux.

Scheduling Tool: Control-M, CA Scheduler, Informatica Scheduler, AutoSys, Windows Scheduler, crontab.

PROFESSIONAL EXPERIENCE

Confidential, Tempe, Arizona

Sr. Database and Informatica (ETL) Developer

Responsibilities:

  • Worked closely with business analysts to understand and document business needs for decision support data.
  • Created the ETL performance expectations document based on the source data profile results.
  • Captured data volumes and upsert/truncate-and-load strategies in the integration design document.
  • Incorporated the refresh strategy, historical data maintenance, archiving strategies for the source flat files, and Audit, Balance and Control (ABC) into the integration design document.
  • Created the technical architecture (hardware and software) to support ETL.
  • Configured Informatica Power Center GRID on Linux platform.
  • Assigned master and worker nodes to GRID in Informatica platform.
  • Created Informatica Data Quality plans, created rules, applied the rules to IDQ plans, and incorporated the plans as mapplets in Informatica PowerCenter.
  • Developed ETL procedures to transform the data in the intermediate tables according to the business rules and functionality requirements.
  • Created high-level and low-level design documents and the ETL standards document.
  • Involved in Extraction, Transformation and Loading (ETL) Process.
  • Installed and configured Informatica 9.5.1 HF3 on Red Hat platform.
  • Wrote a shell script to take weekly repository backups and archive backup files more than 30 days old on Red Hat (see the backup sketch after this list).
  • Created the Visio diagrams.
  • Developed various T-SQL stored procedures, functions and packages.
  • Developed database objects such as SSIS Packages, Tables, Triggers, and Indexes using T-SQL, SQL Analyzer and Enterprise Manager.
  • Worked extensively on AutoSys using CA Workload Center and the JIL checker.
  • Scheduled Informatica jobs using AutoSys.
  • Created dependencies in AutoSys and inserted/updated AutoSys jobs through CA Workload Center.
  • Performed T-SQL tuning and optimized queries for reports with long execution times on SQL Server 2008.
  • Solved T-SQL performance issues using Query Analyzer.
  • Optimized SQL queries, sub queries for SSRS reports.
  • Created the SSRS reports with multiple parameters.
  • Modified the data sets and data sources for SSRS reports.
  • Retrieved data from Oracle EBS and loaded into SQL Server data Warehouse.
  • Worked with Oracle EBS tables such as GL_CODE_COMBINATIONS, GL_LEDGER, GL_PERIODS, GL_JE_SOURCES_TL, AP_CHECKS_ALL, AP_INVOICE_ALL, PO_HEADERS_ALL, PO_LINES_ALL, RA_CUSTOMER_TRX_ALL, and SO_LINES_INTERFACE_ALL.
  • Involved in unit testing, Integration testing and User acceptance testing of the mappings.
  • Performance-tuned Informatica sessions for large data files by increasing block size, data cache size, sequence buffer length, and the target-based commit interval.
  • Developed SSIS packages and migrated from Dev to Test and then to Production environment.
  • Optimized T-SQL queries and converted PL/SQL code to T-SQL.
  • Standardized the T-SQL stored procedures per the organization's standards.
  • Added TRY/CATCH blocks to the T-SQL procedures.
  • Used the MERGE statement in T-SQL for upserts into the target tables (see the MERGE sketch after this list).
  • Made changes to SSRS financial reports based on user input.
  • Installed/configured Teradata PowerConnect for FastExport for Informatica.
  • Involved heavily in creating customized Informatica data quality plans.
  • Worked with address and name data quality.
  • Used Proactive Monitoring for daily/weekly Informatica jobs.
  • Customized the Proactive Monitoring dashboard with Informatica repository tables such as OPB_SESS_TASK_LOG.
  • Resolved data skew issues in Teradata.
  • Created Informatica Data Replication Fast Clone jobs to extract data from Oracle and load it into Teradata.
  • Extensively wrote Teradata BTEQ scripts (see the BTEQ sketch after this list).
  • Installed and configured the Amazon Redshift cloud data integration application for faster data queries.
  • Created JDBC and ODBC connections to Amazon Redshift from the Connect Client tab of the console.
  • Automated administrative tasks of Amazon Redshift such as provisioning and monitoring.
  • Familiar with Amazon Redshift's columnar storage, data compression, and zone maps.
  • Extracted data from complex hierarchical XML schemas for transformation and load into Teradata, and vice versa.
  • Resolved syntax differences between Teradata and Oracle and documented them.
  • Scheduled the workflows to pull data from the source databases at weekly intervals.
  • Used various performance enhancement techniques to enhance the performance of the sessions and workflows.
  • Created the FTP connection from Tidal to the source file server.
  • Retrieved data from XML, Excel, and CSV files.
  • Archived the source files with timestamp using Tidal Scheduler.
  • Performed performance tuning on sources, targets, mappings, and the database.
  • Worked with other teams, such as reporting, to investigate and fix data issues coming out of the warehouse environment.
  • Worked as a production support SME to investigate and troubleshoot data issues coming out of the weekly and monthly processes.
  • Worked with the business to provide a daily production status report covering issues, their priority, and business impact, along with recommended short-term and long-term solutions.
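
A minimal sketch of the weekly repository backup described above, assuming pmrep is on the PATH; the repository, domain, account, and directory names are hypothetical:

    #!/bin/bash
    # Hypothetical weekly backup: connect with pmrep, back up the
    # repository, then compress backups older than 30 days.
    BACKUP_DIR=/infa/backups
    STAMP=$(date +%Y%m%d)

    pmrep connect -r REP_DEV -d Domain_Dev -n admin_user -x "$ADMIN_PWD"
    pmrep backup -o "$BACKUP_DIR/REP_DEV_$STAMP.rep"

    # Archive (rather than delete) anything older than 30 days.
    find "$BACKUP_DIR" -name "*.rep" -mtime +30 -exec gzip {} \;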
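
The MERGE-based upsert mentioned above might look roughly like this; the server, database, and table names are invented for illustration, and the statement is written to a temporary file and run through sqlcmd:

    #!/bin/bash
    # Hypothetical T-SQL MERGE upsert driven from a shell wrapper.
    TMP=/tmp/upsert_$$.sql
    cat > "$TMP" <<'SQL'
    MERGE dbo.DimCustomer AS tgt
    USING stg.Customer AS src
          ON tgt.CustomerID = src.CustomerID
    WHEN MATCHED THEN
        UPDATE SET tgt.CustomerName = src.CustomerName,
                   tgt.UpdatedAt    = GETDATE()
    WHEN NOT MATCHED BY TARGET THEN
        INSERT (CustomerID, CustomerName, UpdatedAt)
        VALUES (src.CustomerID, src.CustomerName, GETDATE());
    GO
    SQL
    sqlcmd -S "$SQL_SERVER" -d "$TARGET_DB" -E -b -i "$TMP"
    rm -f "$TMP"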
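
A BTEQ sketch in the spirit of the bullet above; the tdpid, credentials, and table names are assumptions:

    #!/bin/bash
    # Hypothetical BTEQ run: a here-document keeps the logon and SQL in
    # one script; a nonzero .QUIT code flags the failure to the scheduler.
    bteq <<EOF
    .LOGON tdprod/etl_user,${TD_PWD};
    DELETE FROM edw_stg.stg_orders;
    INSERT INTO edw_stg.stg_orders
    SELECT order_id, order_dt, amount
    FROM   edw_src.orders
    WHERE  order_dt = CURRENT_DATE - 1;
    .IF ERRORCODE <> 0 THEN .QUIT 8;
    .QUIT 0;
    EOF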

Environment: Informatica PowerCenter 9.5.1 HF3/9.1.0 HF1 (Repository Manager, Designer, Workflow Manager, and Workflow Monitor), Amazon Redshift cloud data integrator 10, Business Objects, Erwin 7.2, Oracle 11g, Oracle Exadata, XML, Salesforce.com (SFDC), SQL Server 2008 R2/2012, DB2 8.0/7.0, Team Foundation Server, SQL Server Management Studio, Sun Solaris, Windows XP, Control-M.

Confidential, Phoenix, AZ

Sr. Database and Informatica (ETL) Developer

Responsibilities:

  • Created technical specification documents like system design and detail design documents for the development of Informatica Extraction, Transformation and Loading (ETL) mappings to load data into various tables.
  • Designed and developed ETL processes using Informatica to load data from a wide range of sources such as SQL Server 2005, AS400 (DB2), flat files, and Oracle.
  • Responsible for migrating folders, mappings, and sessions from development to test, and created migration documents to move code between environments.
  • Based on the business logic, developed various mappings and mapplets to load data from various sources using transformations such as Source Qualifier, Filter, Expression, Lookup, Router, Update Strategy, Sorter, Normalizer, Aggregator, Joiner, SQL, and XML.
  • Used connected and unconnected Lookups with static and dynamic caches to implement business logic and improve performance.
  • Worked on creating Business Objects universe objects such as classes, objects, contexts, joins, and tables.
  • Created derived tables in Business Objects.
  • Created test cases for unit testing, system integration testing, and UAT to check the data.
  • Extensively worked on ETL performance tuning of the data loads and worked with DBAs on SQL query tuning.
  • Developed PL/SQL stored procedures, views, and triggers to implement complex business logic to extract, cleanse, transform, and load the data into the Oracle database.
  • Used session partitions to improve database load times.
  • Registered and extracted Oracle CDC tables using the PowerExchange Navigator.
  • Imported PowerExchange registration tables to implement CDC in Informatica.
  • Created sessions and used pre- and post-session properties to execute scripts and handle errors (see the sketch after this list).
  • Ran Informatica MDM batch group jobs and analyzed rejected records.
  • Created UNIX scripts to schedule Informatica workflows through the pmcmd command.
  • Extensively worked with Workflow Manager and Workflow Monitor to create, schedule, and monitor workflows, worklets, sessions, and tasks.
  • Used Email, Control, Link, and Command tasks in Informatica workflows.
  • Created Informatica mappings and workflows to run Informatica MDM stage, base object, and match/merge jobs.
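
One plausible use of the pre-session script hook mentioned above, sketched as a source-file arrival check; the path and retry limit are invented:

    #!/bin/bash
    # Hypothetical pre-session check: wait up to 30 minutes for the
    # source file to land before the session is allowed to start.
    SRC=/data/inbound/customer_daily.dat
    tries=0
    until [ -s "$SRC" ]; do
        tries=$((tries + 1))
        if [ "$tries" -gt 30 ]; then
            echo "Source file missing: $SRC" >&2
            exit 1                 # nonzero exit fails the session
        fi
        sleep 60
    done
    exit 0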

Environment: Informatica 9.5.1/9.1 (PowerCenter, Designer, Workflow Manager, Repository Manager, Monitor), PowerExchange, Oracle 11g, AS400 (DB2), flat files, PL/SQL, SQL*Loader, TOAD, SQL Developer, UNIX Sun Solaris, Informatica MDM, AutoSys scheduler (JIL), Business Objects 4.1, SVN.

Confidential, Dallas, TX

Sr. Informatica (ETL) Developer

Responsibilities:

  • Created technical specification documents like system design and detail design documents for the development of Informatica Extraction, Transformation and Loading (ETL) mappings to load data into various tables.
  • Designed and developed ETL processes using Informatica to load data from a wide range of sources such as SQL Server 2005, AS400 (DB2), flat files, and Oracle.
  • Responsible for migrating folders, mappings, and sessions from development to test, and created migration documents to move code between environments.
  • Based on the business logic, developed various mappings and mapplets to load data from various sources using transformations such as Source Qualifier, Filter, Expression, Lookup, Router, Update Strategy, Sorter, Normalizer, Aggregator, Joiner, SQL, and XML.
  • Used connected and unconnected Lookups with static and dynamic caches to implement business logic and improve performance.
  • Worked on creating Business Objects universe objects such as classes, objects, contexts, joins, and tables.
  • Made changes to the BO GL reports based on user input.
  • Worked on BO security settings for the users.
  • Created derived tables in Business Objects.
  • Created test cases for unit testing, system integration testing, and UAT to check the data.
  • Extensively worked on ETL performance tuning of the data loads and worked with DBAs on SQL query tuning.
  • Developed PL/SQL stored procedures, views, and triggers to implement complex business logic to extract, cleanse, transform, and load the data into the Oracle database.
  • Used session partitions to improve database load times.
  • Registered and extracted Oracle CDC tables using the PowerExchange Navigator.
  • Imported PowerExchange registration tables to implement CDC in Informatica.
  • Created sessions and used pre- and post-session properties to execute scripts and handle errors.
  • Ran Informatica MDM batch group jobs and analyzed rejected records.
  • Created UNIX scripts to schedule Informatica workflows through the pmcmd command.
  • Extensively worked with Workflow Manager and Workflow Monitor to create, schedule, and monitor workflows, worklets, sessions, and tasks.
  • Used Email, Control, Link, and Command tasks in Informatica workflows.
  • Created Informatica mappings and workflows to run Informatica MDM stage, base object, and match/merge jobs.

Environment: Informatica 9.5.1/9.1 (PowerCenter, Designer, Workflow Manager, Repository Manager, Monitor), PowerExchange, Oracle 11g, AS400 (DB2), flat files, PL/SQL, SQL*Loader, TOAD, SQL Developer, UNIX Sun Solaris, Informatica MDM, Informatica Scheduler, Business Objects 4.1.

Confidential, Norfolk, VA

Sr.ETL Developer

Responsibilities:

  • Created technical specification documents like system design and detail design documents for the development of Informatica Extraction, Transformation and Loading (ETL) mappings to load data into various tables.
  • Designed and developed ETL process using Informatica tool to load data from wide range of sources such as Flat Files and Oracle Database.
  • Responsible for migrating folders, mappings, and sessions from development to test, and created migration documents to move code between environments.
  • Based on the business logic, developed various mappings and mapplets to load data from various sources using transformations such as Source Qualifier, Filter, Expression, Lookup, Router, Update Strategy, Sorter, Normalizer, Aggregator, Joiner, SQL, and XML.
  • Used connected and unconnected Lookups with static and dynamic caches to implement business logic and improve performance.
  • Extracted files and transferred them to the Informatica server using FTP scripts developed in Python.
  • Used PowerCenter jobs to read mainframe files through the PowerExchange source plug-in and write them to the Oracle staging database.
  • Parsed records in PowerExchange using data maps created from COBOL layouts based on segment names.
  • Used PowerCenter and UNIX scripts to determine the number of threads and the run sequence, referred to as the program directory (a listing of batches and jobs).
  • Created the program directory, a listing of the Informatica jobs along with parameters such as folder name and source file name.
  • Created Test cases for Unit Test, System Integration Test and UAT to check the data.
  • Extensively worked on ETL performance tuning of the data loads and worked with DBAs on SQL query tuning.
  • Used session partitions to improve database load times.
  • Tested the existing mappings and redesigned the mappings to improve the performance and the efficiency of the design logic.
  • Created sessions and used pre- and post-session properties to execute scripts and handle errors.

Environment: Informatica 9.1/8.6 (PowerCenter, Designer, Workflow Manager, Repository Manager, Monitor), PowerExchange, Oracle 11g, mainframe files (COBOL files), flat files, PL/SQL, SQL*Loader, TOAD, SQL Developer, UNIX Sun Solaris, Erwin data modeling tool, AutoSys.

Confidential, TX

Sr. Informatica Developer

Responsibilities:

  • Involved in the study of the business logic and coordinated with the client to gather user requirements.
  • Created technical specification documents like system design and detail design documents for the development of Informatica Extraction, Transformation and Loading (ETL) mappings to load data into various tables and defining ETL standards.
  • Designed and developed ETL process using Informatica tool to load data from wide range of sources such as Oracle, DB2, SQL Server, XML and Flat Files.
  • Based on the logic, developed various mappings and mapplets to load data from various sources using transformations such as Source Qualifier, Filter, Expression, Lookup, Router, Update Strategy, Sorter, Aggregator, Joiner, and SQL.
  • Created mappings that implement error-handling logic to produce error/OK flags and an error message depending on the source data and the lookup outputs.
  • Developed and Implemented Informatica parameter files to filter the daily data from the source system.
  • Created mappings in the designer to implement Type 2 SCD.
  • Fine-tuned the mappings by analyzing data flow, and worked with static and dynamic memory caches for better throughput of sessions containing Lookup, Joiner, and Aggregator transformations.
  • Responsible for migrating folders, mappings, and sessions from development to test, and created migration documents to move code between environments.
  • Performed insert, update, delete, and upsert operations for large-volume incremental loads using Teradata MultiLoad and FastLoad (see the FastLoad sketch after this list).
  • Developed Informatica parameter files to filter the daily source data.
  • Responsible for loading data into the warehouse using Oracle Loader.
  • Worked with the Stored Procedure transformation for time zone conversions.
  • Created UNIX scripts to transfer (FTP) files from the Windows server to the specified location on the UNIX server (see the FTP sketch after this list).
  • Created UNIX scripts to automate activities such as starting, stopping, and aborting Informatica workflows using the pmcmd command.
  • Created batch scripts on the Windows server to archive and delete files, and ran these scripts using AutoSys.
  • Created Oracle stored procedures to implement complex business logic with good performance, called from Informatica using the Stored Procedure transformation.
  • Used various Oracle index techniques such as B*Tree and bitmap indexes to improve query performance, and created scripts to update table statistics for better explain plans.
  • Created materialized views to summarize data based on user requirements and improve Business Objects report query performance.
  • Responsible for loading history data into the warehouse using Oracle Loader.
  • Extensively involved in the analysis and tuning of the application code (SQL).
  • Created test cases for unit testing, system integration testing, and UAT to check the data.
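
A minimal FastLoad sketch for the large-volume load bullet above; the tdpid, credentials, file layout, and staging table are assumptions:

    #!/bin/bash
    # Hypothetical FastLoad: bulk-load a pipe-delimited file into an
    # empty staging table; error tables from a prior run are dropped.
    fastload <<EOF
    LOGON tdprod/etl_user,${TD_PWD};
    DROP TABLE edw_stg.stg_orders_e1;
    DROP TABLE edw_stg.stg_orders_e2;
    SET RECORD VARTEXT "|";
    DEFINE order_id (VARCHAR(10)),
           amount   (VARCHAR(18))
    FILE = /data/inbound/orders.dat;
    BEGIN LOADING edw_stg.stg_orders
          ERRORFILES edw_stg.stg_orders_e1, edw_stg.stg_orders_e2;
    INSERT INTO edw_stg.stg_orders (order_id, amount)
    VALUES (:order_id, :amount);
    END LOADING;
    LOGOFF;
    EOF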
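
The FTP transfer and timestamped archiving described above could be scripted roughly as follows; hostnames, credentials, and directories are placeholders:

    #!/bin/bash
    # Hypothetical FTP pull from the Windows file server, then archive
    # the landed file with a timestamp so reruns never overwrite it.
    HOST=winfs01.example.com
    SRC_FILE=sales_extract.csv
    LANDING=/data/inbound
    ARCHIVE=/data/archive

    ftp -n "$HOST" <<EOF
    user ftp_user $FTP_PWD
    binary
    cd /exports
    lcd $LANDING
    get $SRC_FILE
    bye
    EOF

    cp "$LANDING/$SRC_FILE" "$ARCHIVE/${SRC_FILE}.$(date +%Y%m%d%H%M%S)"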

Environment: Informatica 8.6 (PowerCenter, Designer, Workflow Manager, Administrator, and Repository Manager), Oracle 11g, flat files, Teradata FastLoad, MultiLoad, and TPump loads, PL/SQL, SQL*Loader, TOAD, Business Objects 6.5, UNIX Sun Solaris, Erwin data modeling tool, AutoSys, ClearCase.

Confidential, Pittsburg, PA

Informatica/ETL Developer

Responsibilities:

  • Involved in translating business requirements to integrate into existing Datamart design.
  • Developed ETL jobs to extract information from Enterprise Data Warehouse.
  • Extensively used ETL to load data from different relational databases, XML and flat files.
  • Used Informatica Repository Manager to create repositories and users and to grant user permissions.
  • Debugged the mappings extensively, hard-coding test data IDs to test the logic instance by instance.
  • Performed various transformations, aggregations and ranking routines on the data to be stored in the application reporting mart.
  • Handled the migration process across the Development, Test, and Production environments.
  • Implemented Type 2 slowly changing dimensions to maintain dimension history and Tuned the Mappings for Optimum Performance.
  • Used Informatica Designer to design mappings and coded them using reusable mapplets.
  • Developed workflow sequences to control the execution sequence of various jobs and to email support personnel.
  • Involved in unit testing and documenting the jobs and work flows.
  • Set Standards for Naming Conventions and Best Practices for Informatica Mapping Development.
  • Used database objects such as sequence generators and stored procedures to handle complex logic.
  • Created various UNIX shell scripts for Job automation of data loads.
  • Created mappings that implement error-handling logic to produce error/OK flags and an error message depending on the source data and the lookup outputs.
  • Extensively involved in the analysis and tuning of the application code (SQL).

Environment: Informatica 7.1 (PowerCenter, Designer, Workflow Manager, Administrator, Repository Manager), Oracle 9i, DB2, PL/SQL, SQL*Loader, TOAD, UNIX Sun Solaris, Teradata, Erwin, MS Visio, Windows 2000, UNIX HP-UX, AutoSys.

Confidential, TX

Sr. ETL Developer (Informatica)

Responsibilities:

  • Involved in all phases of the SDLC, from requirements through design, development, and testing. The warehouse enables management to forecast their markets accurately, identify the factors that most impact each segment's profitability, and define strategy for each product segment; the project focused on developing various detail and summary reports for the sales and marketing teams.
  • Extensively involved in requirement analysis and created the ETL mapping design document.
  • Extensively designed, developed, and tested complex Informatica mappings and mapplets to load data from external flat files and other databases.
  • Developed Data Migration and Cleansing rules for the Integration Architecture (OLTP, ODS, DW).
  • Developed logical and physical data models that capture current-state/future-state data elements and data flows using Erwin.
  • Extensively Worked on Informatica tools such as Source Analyzer, Warehouse Designer, Transformation Designer, Mapplet Designer and Mapping Designer
  • Extensively used all the transformations like source qualifier, aggregator, filter, joiner, Sorter, Lookup, Update Strategy, Router and Sequence Generator
  • Extensively worked on the Repository Manager to create/modify/delete users/group/roles.
  • Created reusable transformations and used in various mappings
  • Involved in running the loads to the data warehouse and data mart involving different environments.
  • Responsible for loading data into warehouse using Oracle Loader.
  • Extensively worked on workflow manager and workflow monitor to create, schedule, monitor workflows, work lets, sessions, tasks etc.
  • Extensively worked on ETL performance tuning of the data loads and worked with DBAs on SQL query tuning.
  • Responsible for definition, development and testing of processes/programs necessary to extract data from client's operational databases, Transform and cleanse data, and Load it into data marts.
  • Developed Informatica parameter files to filter the daily source data.
  • Extensively used PL/SQL programming in back-end and front-end functions, procedures, and packages to implement business rules.
  • Integrated various sources into the Staging area in Data warehouse
  • Provided technical support to the Quality Assurance team and Production group.
  • Support Project Manager in estimating tasks.
  • Provided production support to schedule and execute production batch jobs and analyzed log files.
  • Involved in all phases of Data quality assurance.
  • Developed shell scripts to automate the data loading process and cleanse the flat file inputs (see the sketch after this list).
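
A sketch of the AWK/SED-style flat-file cleansing mentioned above; the delimiter, expected field count, and file names are assumptions:

    #!/bin/bash
    # Hypothetical cleanse: strip DOS carriage returns, drop the header
    # row, discard records with the wrong field count, and trim
    # whitespace around the third field.
    sed 's/\r$//' /data/inbound/raw_feed.dat |
    awk -F'|' -v OFS='|' 'NR > 1 && NF == 5 {
        gsub(/^[ \t]+|[ \t]+$/, "", $3)
        print
    }' > /data/inbound/clean_feed.dat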

Environment: Informatica PowerCenter 8.5, Oracle 10g, PL/SQL, SQL, SQL*Plus, SQL*Loader, Erwin 3.5, flat files, Teradata, SQL Server 2005, DB2, COBOL, UNIX shell scripting using AWK/SED, HP-UX 10.20/NT.

Confidential, Houston, TX

ETL/Oracle Developer

Responsibilities:

  • Was involved in the Development of Mappings, Sessions, and Workflows along with creating various Transformations within Informatica Mappings as per the Spec.
  • Extensively used Expression, Joiner (for heterogeneous sources), Lookup, Filter, Aggregator, and Update Strategy transformations to transform data before loading into the target tables. Created mappings using reusable transformations and mapplets to build error-free objects.
  • Was involved in the creation of shell scripts for ETL process automation and other specific processes based on requirements. Scheduled the scripts in UNIX using the crontab command (see the sketch after this list).
  • Was involved in the creation of Test Cases using SQL and executing Unit Test Scripts and publishing Test Results.
  • Tuned the mappings' logic using various methods to provide maximum efficiency and performance.
  • Tuned the source database SQL statements according to the business requirements and the data flow between the Informatica transformations.
  • Created tasks such as Decision, Event Wait, Event Raise, and Email tasks and batches in the Task Developer, and scheduled the workflows in the Workflow Designer of Workflow Manager during testing.
  • Worked with the DBA to create materialized views, sequences, views, and other DDL, and to write and optimize complex SQL queries involving joins and subqueries.
  • Attended daily and weekly project status meetings.
  • Took part in project technical discussions.
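
The crontab scheduling mentioned above would look something like the entry below (the script path and time are placeholders; note that % must be escaped in crontab lines):

    # Hypothetical crontab entry (installed with "crontab -e"): run the
    # ETL wrapper nightly at 02:30, appending output to a dated log.
    30 2 * * * /home/etl/bin/run_nightly_load.sh >> /home/etl/logs/load_$(date +\%Y\%m\%d).log 2>&1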

Environment: Oracle 8i, SQL, PL/SQL, Erwin 4.1, MicroStrategy, Informatica PowerCenter 5.x, ANSI SQL, Windows 2000.
