
Senior ETL Developer Resume


Newport Beach, CA

SUMMARY

  • 9+ years of IT development experience building data integration solutions and BI reporting applications for US clients.
  • Hands-on experience in understanding integration requirements and building data integration solutions using Informatica PowerCenter 8.1/9.0, Informatica Data Explorer (IDE), DataStage, SSIS, Confidential Data Integrator (DI) v9.2/v10, and Dell Boomi.
  • Subject Matter Expert (SME) in creating data integration maps, processes, and scripts using ETL tools (Informatica, DataStage, SSIS, Confidential Data Integrator v9.2 and v10) per project requirements.
  • Expertise in preparing ETL design documents, Proofs of Concept (POC), detailed design documents (LLD, HLD, Visio job linkage, etc.), and support documents.
  • Expertise in developing workflows, sessions, command tasks, complex mappings, and all types of transformations with strong performance-tuning techniques.
  • Specialized in writing PL/SQL stored procedures for validating data and updating user data in the Oracle EBS application database.
  • Experience in developing BI reports using Oracle Discoverer and Tableau.
  • Excellent team player with strong interpersonal, written, and verbal communication skills.

TECHNICAL SKILLS

ETL Tools: Informatica, SSIS, Confidential Data Integrator v9.2, v10, Dell Boomi

Languages: C#, ASP.Net, VB.NET, Java/J2EE, ASP, AJAX, VB6, C/C++

Framework/Lib/API: Java Struts, JSP

Scripting Languages: VBScript, JavaScript, RIFL Script

Web Technologies: XML, XSL, XSLT, CSS, HTML5

Databases: SQL Server 2005, 2008, Oracle 10g/11g, SQL, PL/SQL

Development Tools: Eclipse, Wing IDE

Testing tools: QTP, TestLink

BI Reporting Tools: Tableau, Oracle Discoverer

Operating System: Windows, UNIX, Linux

PROFESSIONAL EXPERIENCE

Confidential, Newport Beach, CA

Senior ETL Developer

Responsibilities:

  • Developed the data mapping between Bills, Claims, Services, and Provider files and their individual tables in the staging database.
  • Used the Process Designer FTP connector to get files from the client FTP server and place them in a shared location.
  • Created entries in the control table with the file name, provider name, and file type.
  • Created maps for mapping each file to its individual tables.
  • Used Dell Boomi internal functions and additional JavaScript to handle special mapping logic and validations.
  • Created and automated a process to load files from FTP to the staging database and import the data from the staging database to the ID 360 and MD 360 databases.
  • Deployed the created processes to the QA server linked to an Atom.
  • Performed unit testing and validated test results with business users.
  • Maintained the process after attaching it to a Production Atom.
  • Supported user acceptance testing and validation of data in ID 360 and MD 360 (provider scoring tools).
  • Implemented the maps and processes in production and automated them to run at scheduled intervals using the Dell Boomi internal scheduler.
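The mapping and validation logic above ran inside Boomi scripting steps; a minimal standalone sketch of the kind of record validation involved (plain JavaScript rather than Boomi's scripting API, and the field names are hypothetical):

```javascript
// Hypothetical provider-record validation of the kind used in the
// Boomi mapping steps above; field names are illustrative only.
function validateProviderRecord(rec) {
  const errors = [];
  if (!rec.providerId || !/^\d{6,10}$/.test(rec.providerId)) {
    errors.push("providerId must be 6-10 digits");
  }
  if (!rec.fileType || !["BILL", "CLAIM", "SERVICE"].includes(rec.fileType)) {
    errors.push("unknown fileType: " + rec.fileType);
  }
  // Normalize the provider name the way a mapping function might.
  const name = (rec.providerName || "").trim().toUpperCase();
  return { valid: errors.length === 0, errors, providerName: name };
}
```

Records that fail validation would be routed to an error path rather than loaded into the staging tables.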

Environment: Dell Boomi, DMCS, SQL Server 2008, Windows 7

Confidential, St. Louis, MO

ETL Developer

Responsibilities:

  • Developed the data mapping between MYCBO and incoming payment files.
  • Used Process Designer and Web Invoker to make web service calls to DM9 to POST payment data.
  • Used Dell Boomi internal functions and additional JavaScript to handle special mapping logic.
  • Performed unit testing and validated test results with business users.
  • Supported user acceptance testing and validation of data imports into DM9.
  • Implemented the maps and processes in production and automated them to run at scheduled intervals using the Dell Boomi internal scheduler.

Environment: Confidential Data Integrator v9.2, DMCS, SQL Server 2008, Windows 7

Confidential, Frederick, MD

ETL Developer

Responsibilities:

  • Developed the data mapping between DMCS and incoming payment files.
  • Used Process Designer and Web Invoker to make web service calls to POST payment data to DMCS.
  • Created RIFL scripts to handle special mapping logic.
  • Performed unit testing and validated test results with business users.
  • Supported user acceptance testing and validation of data imports into DMCS.
  • Implemented the maps and processes in production and automated them to run at scheduled intervals using the Windows scheduler.

Environment: Dell Boomi, DMCS, SQL Server 2008, Windows 7

Confidential, Chicago, IL

ETL Developer

Responsibilities:

  • Developed the data mapping between Salesforce.com and Softrax (ERP).
  • Used Process Designer and the Salesforce connector to pull SFDC data and convert it into XML files placed in the Softrax import folder.
  • Built processes in Boomi using the Web Services SOAP Client connector to extract and transform data from MS Dynamics into flat XML files for import into Softrax.
  • Developed maps and processes using the Web Services connector in the connector execution step in Boomi to get the Sales Order data from Salesforce.
  • Created mappings from Salesforce data to generate an XML import file for Softrax to process.
  • Used the Business Rules component to filter out invalid and unprocessed orders from Salesforce.
  • Configured and set up Atoms and attached them to the Test and Production environments.
  • Performed unit testing and validated test results with business users.
  • Supported user acceptance testing and validation of data imports into Softrax.
  • Implemented the maps and processes in production and automated them to run at scheduled intervals.
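The business-rule filtering above can be pictured with a plain JavaScript stand-in (this is not Boomi's Business Rules component itself, and the order fields — status, accountId, orderTotal — are invented for illustration):

```javascript
// Illustrative stand-in for the business-rule filtering described above:
// keep only orders that are processed and pass basic sanity checks.
// Field names (status, accountId, orderTotal) are hypothetical.
function filterValidOrders(orders) {
  return orders.filter(o =>
    o.status === "Processed" &&   // skip unprocessed orders
    o.accountId &&                // order must reference an account
    Number(o.orderTotal) > 0      // reject zero or negative totals
  );
}
```

Orders rejected by rules like these would be held back instead of being written into the Softrax import file.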

Environment: Dell Boomi, Salesforce.com API, and Softrax XML Utility

Confidential, Los Angeles, CA

ETL Developer

Responsibilities:

  • Developed a data integration process using Invoker to fetch Sales Order data from Confidential via the Web Services API provided by Confidential.
  • Developed maps and processes using the Web Services connector in the connector execution step in Boomi to get the Sales Order data from Confidential.
  • Used the Map execution step to create the mapping between Sales Order data and QuickBooks.
  • Used the Business Rules component to filter out invalid and unprocessed orders from Confidential.
  • Inserted processed orders into QB using the QB connector in Dell Boomi.
  • Developed an integration process to integrate inventory data from QB into the warehousing application.
  • Prepared the test plan and performed unit and integration testing.
  • Set up new Test and Production environments.
  • Scheduled the process in the Dell Boomi Atom Management tab.
  • Provided maintenance support by fixing production issues.

Environment: Dell Boomi, QuickBooks, Windows Scheduler

Confidential, Los Angeles, CA

ETL Developer

Responsibilities:

  • Developed a complex process in Dell Boomi to complete the data flow from Confidential to Microsoft Dynamics GP and back to Confidential.
  • Worked closely with the Dell Boomi team to modify their ETL jobs created in AtomSphere.
  • Set up various ETL processes for extracting data from Confidential (an online order management system) using the Web Services SOAP Client connector.
  • Used the Data Process and Flow Control execution steps to process each order individually, and used a Process Call to execute a sub-process for posting the order to Dynamics.
  • Wrote DOS scripts (.BAT files) to call the data generation programs at scheduled intervals by scheduling the .BAT files in the Windows scheduler.
  • Worked with the Dell Boomi, MS Dynamics GP, and Confidential teams to ensure that integration testing was done thoroughly end-to-end.
  • Assisted in implementing the programming changes in production.
  • Provided post-go-live support by fixing data file and security/connection issues.

Environment: Dell Boomi, Microsoft Dynamics GP, Confidential, Tableau

Confidential, Vernon, CA

ETL Developer

Responsibilities:

  • Analyzed the data mapping between Oracle CRM On Demand (CRMOD) and Oracle EBS.
  • Developed processes in Informatica PowerCenter 8.1 to access the Oracle CRMOD Web Services API (using Invoker) and generate XML customer master data.
  • Developed JavaScript to validate the XML data and upload it into the Oracle EBS database.
  • Wrote new PL/SQL stored procedures to process the data loaded into Oracle EBS and scheduled Oracle EBS concurrent programs to move the data into the core tables for creating new customers.
  • Implemented the programs in production and scheduled them to run nightly, automatically creating all new customers overnight in Oracle EBS.
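The JavaScript validation step above amounted to checking each customer fragment before load; a minimal sketch of that kind of check (the element names CustomerName and AccountNumber are hypothetical, and a real implementation would use a proper XML parser rather than regular expressions):

```javascript
// Minimal sketch of the XML validation step: verifies that a customer
// record fragment contains non-empty required elements before load.
// Element names (CustomerName, AccountNumber) are hypothetical.
function hasRequiredCustomerFields(xml) {
  const required = ["CustomerName", "AccountNumber"];
  return required.every(tag => {
    const m = xml.match(new RegExp("<" + tag + ">([^<]+)</" + tag + ">"));
    return m !== null && m[1].trim().length > 0;
  });
}
```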

Environment: Informatica PowerCenter 8.1, Oracle EBS & Oracle CRM On Demand, Tableau

Confidential

Responsibilities:

  • Gathered the data extract requirements from Oracle EBS.
  • Wrote complex SQL and PL/SQL programs to extract data from multiple tables and feed it to Confidential maps that generate .TXT files.
  • Wrote DI processes using the FTP Invoker to transmit the .TXT files to the AR7 FTP server.
  • Performed unit testing of the DI maps and processes that perform the data extract from EBS and the .TXT file transfer to the AR7 FTP server.
  • Scheduled the data extract programs in production to run nightly, automatically sending EBS data to the AR7 server daily.

Environment: PL/SQL, SQL, Oracle EBS, Oracle 10g, Informatica PowerCenter 8.1

Confidential, Vernon, CA

ETL Developer

Responsibilities:

  • Built data integration maps and processes to integrate customers, contacts, and cases bidirectionally between Salesforce and RightNow.
  • New incidents from RightNow are pulled into Salesforce as Cases so the Sales/Marketing team can track the status of customer support.
  • Accounts and contacts are sent from Salesforce to RightNow.
  • Developed Informatica mappings for the customers, contacts, and cases modules per the client's requirements.
  • Handled lookup tables to meet complex integration requirements.
  • Developed scripts to implement special validation logic given in the data mapping.
  • Created new customized lead-generation processes in Informatica to send emails to leads.
  • Scheduled the Informatica maps and processes on Data Cloud to execute the data integration jobs at regular intervals.
  • Provided level 1 support on the integration by answering business users' questions and fixing problems.
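The lookup-table handling above maps source-system values to target-system codes; a tiny JavaScript sketch of the idea (the table contents and status values here are invented, not the actual RightNow/Salesforce mapping):

```javascript
// Sketch of lookup-table mapping between systems; the status values
// and their translations below are invented examples.
const caseStatusLookup = {
  "Unresolved": "Open",
  "Solved": "Closed",
  "Updated": "In Progress",
};

function mapCaseStatus(sourceStatus) {
  // Fall back to a default when the lookup has no entry.
  return caseStatusLookup[sourceStatus] || "Open";
}
```

In the actual integration the lookup rows lived in tables rather than in code, so business users could adjust the mapping without a redeploy.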

Environment: Informatica PowerCenter 8.1, Salesforce, RightNow.

Confidential, Austin, TX

ETL Developer

Responsibilities:

  • Modified existing Java programs to incorporate the enhancements given in the design specifications.
  • Prepared test cases, performed unit and system testing of the changes, and turned them over to the Confidential team for acceptance testing.
  • Assisted with the deployment of changes to the production environment.
  • Prepared documentation of the changes and handed it over to the Confidential technical team.

Confidential

Responsibilities:

  • Set up the test environment for testing the ODBC connectors for SQL Server and Oracle by installing the respective DBMSs and creating test databases.
  • Populated the test databases with test data.
  • Prepared test cases for each of the connectors.
  • Created data integration maps and processes using DI v10 for each test case.
  • Executed the data integration maps and processes using sample data and ensured that the output data matched the expected results.
  • Captured the test results in the test plan and updated the TestLink tool with test case status.
  • Used QTP to automate GUI and performance testing.
  • For testing application connectors such as SFDC and MS CRM, set up test instances of the cloud applications and used them for testing.

Confidential

Responsibilities:

  • Read the test plan to understand the test outline and the components to be tested.
  • Prepared test cases for each of the components and reviewed them with the team lead.
  • Created data integration maps and processes using DI v10 for each test case.
  • Executed the data integration maps and processes using sample data and ensured that the output data matched the expected results.
  • Captured the test results in the test plan and updated the TestLink tool with test case status.
  • For performance testing, used QTP to generate automated test scripts that were executed repeatedly with large volumes of data.
  • Prepared documentation of the test maps/processes and QTP scripts and delivered it to the Confidential team for review and acceptance.

Environment: Confidential DI v10, QTP, TestLink (test results capturing tool), SQL Server, Oracle, XML, Java, JSP.

Confidential

ETL Developer

Responsibilities:

  • Developed .NET-based web application programs per design specifications.
  • Created database tables in SQL Server per design specifications and generated test data.
  • Wrote SQL queries embedded in the .NET programs to build adapters and datasets for processing the data sequentially per the business requirements.
  • Set up the IIS web server to configure the web application.
  • Performed unit and system testing.

Environment: XML, Finacle, Confidential Data Integrator v9, ASP.NET, ADO.NET, SQL Server, MS Visual Studio, PL/SQL
