
Senior Data Integration Developer Resume


Denver, CO

SUMMARY:

  • Experienced professional with over five years of progressive expertise in requirements analysis, application development and testing, with a comprehensive background in Database development, Data Warehousing and Data Integration.
  • Over five years of design and development of ETL solutions for corporate-wide data transformation and processing using SSIS and the Informatica PowerCenter platform.
  • Clear understanding of Business Intelligence and Data Warehousing concepts, with emphasis on ETL and the Data Warehouse Development Life Cycle.
  • Strong technical exposure and a high degree of competence in business domains such as Retail, Insurance and Healthcare.
  • Implemented complex business logic in ETL process using SSIS and Informatica.
  • Implemented ETL Framework in SSIS and Informatica.
  • Involved in all facets of development, including analysis, design, development and testing of data loads from various sources into Data Warehouses and Data Marts.
  • Experience loading data from legacy systems into Oracle Data Warehouse using Informatica.
  • Extensive database experience using SQL Server 2008/2012/2014, Oracle Exadata/11g/10g, Teradata, Netezza.
  • Knowledge of PL/SQL, T-SQL, Stored Procedures and Unix shell scripting.
  • Experience coding dynamic SSIS packages that load multiple files with the same code.
  • Experience performance tuning SQL queries on SQL Server, Oracle, Teradata and other platforms.
  • Excellent communication and interpersonal skills, with experience working in teams and coordinating offshore resources.
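The dynamic multi-file load pattern above can be sketched in shell. This is a hypothetical analogue only: in SSIS the same idea is expressed with a Foreach Loop container and package variables, and the file names, directory, and `load_one` loader step here are invented for illustration.

```shell
#!/bin/sh
# Hypothetical sketch: one loader routine reused for every inbound file,
# analogous to a parameterized SSIS package driven by a Foreach Loop.
# load_one() stands in for the real bulk-load step (bcp, BULK INSERT, etc.).
load_one() {
    file="$1"
    table="$2"
    echo "loading $file into $table"
}

# Same code path for every file: derive the staging table from the file name.
for f in data/sales_*.csv; do
    [ -e "$f" ] || continue          # skip if the glob matched nothing
    base=$(basename "$f" .csv)       # e.g. sales_20240101
    table="stg_${base%%_*}"          # e.g. stg_sales
    load_one "$f" "$table"
done
```

The point of the pattern is that adding a new daily file requires no code change, only a new file matching the naming convention.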

TECHNICAL SKILLS:

ETL Tools: SSIS 2008/2012, Informatica PowerCenter v8.6.1/9.1, BTEQ

Databases: Oracle 10g/9i/8i, MS SQL Server 2005/2008/2012, PDW, Teradata, Netezza.

Database Tools: SQL Plus, Toad, PLSQL Developer, SQL Assistant, Aginity

Others: PVCS, Control-M, MS Visio, Tidal, MS Office, SharePoint, WordPress.

Languages: C, C++, SQL, Shell scripting (ksh), Matlab, basics of PHP and HTML.

Operating Systems: Windows 9x/NT/2000/XP, UNIX (AIX), LINUX.

PROFESSIONAL EXPERIENCE:

Confidential, Denver, CO

Senior Data Integration Developer

Responsibilities:

  • Involved in the full System Development Life Cycle (SDLC) and responsible for all parts of requirements gathering, planning, design, development and testing.
  • Sourcing data from Netezza, staging into SQL Server using SSIS.
  • Converting complex Netezza stored procedures into Informatica mappings.
  • Developed stored procedures to load/update tables with billions of records using partitions
  • Developing stored procedures to load ODS.
  • Creating ETL design based on the requirements.
  • Developing complex T-SQL queries.
  • Developing backend process to support UI project.
  • Developed standard ETL framework in Informatica and SSIS.
  • Implemented change data capture (CDC) on Teradata using BTEQ scripts.
  • Involved in developing solution to generate the input for CDI process.
  • Developing MultiLoad (mload), FastLoad (fload) and FastExport (fexp) scripts to load flat files into, and extract them from, Teradata.
  • Developing SQL scripts to perform various ETL work on Teradata.
  • Developed several SSIS packages to transform and load the data into SQL Server.
  • Developed complex SSIS packages to load data into flat files and Excel spreadsheets.
  • Developing and tuning complex stored procedures.
  • Developing shell scripts for flat file validation steps.
  • Conducting unit test and helping testers with creating system test cases.
  • Coded several SQL scripts in Oracle, SQL server and Teradata to generate the input files from several sources for CDI process.
  • Developing Visio diagrams for data flow and file transfer processes.
  • Developed Tidal jobs and SQL Server Agent jobs to automate the ETL process.
  • Involved in transferring files from client servers to the local server using tools such as Sterling and Connect:Direct.
  • Involved in building the data warehouse for marketing purposes.
  • Creating and maintaining DTLM for the project.
  • Involved in analyzing the quality of data provided by the clients.
  • Involved in linkage analysis, to determine the relational integrity of the data.
  • Developing scripts to create input for the Merkle hosted CDI process.
  • Setting up file transfers between Merkle and Clients.
  • Performance tuning the SQL scripts on Teradata.
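The flat-file validation step mentioned above can be sketched as a shell function. This is a minimal sketch under assumptions: the real scripts validated actual client feeds, while the pipe-delimited layout and expected column count here are hypothetical.

```shell
#!/bin/sh
# Hypothetical sketch of a flat-file validation step: check that every
# record in a pipe-delimited feed has the expected number of fields
# before it is handed to the ETL load.
validate_feed() {
    file="$1"
    expected_cols="$2"
    bad=$(awk -F'|' -v n="$expected_cols" 'NF != n { c++ } END { print c+0 }' "$file")
    if [ "$bad" -eq 0 ]; then
        echo "PASS: $file"
        return 0
    else
        echo "FAIL: $file ($bad malformed records)"
        return 1
    fi
}
```

A production validation layer would typically also reconcile the record count against a control or trailer file and quarantine rejected feeds rather than just reporting them.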

Environment: Teradata, SQL Assistant, SQL Server, Informatica, SSIS, HP-Quality Center, Unix/Linux, Tidal, Knowledge Link, File Bridge.

Confidential, Denver, CO

ETL Developer

Responsibilities:

  • Involved in the full System Development Life Cycle (SDLC) and responsible for all parts of requirements gathering, planning, design, development, testing and implementation.
  • Extensively used Informatica Client tools - Source Analyzer, Mapping Designer, Informatica Repository Manager and Informatica Workflow Manager.
  • Performance tuning SQL Scripts on Oracle 11g and Exadata.
  • Extensively used ETL (Informatica) to load data from source (Flat files) to target Oracle database.
  • Worked on complete life cycle from Extraction, Transformation and Loading of data using Informatica.
  • Developed various mappings using Mapping Designer and worked with Aggregator, Lookup (connected and unconnected), Filter, Router, Joiner, Source Qualifier, Expression, Stored Procedure, Sorter, Sequence Generator, Rank and Web Service Consumer transformations.
  • Created and Configured Workflows and Sessions to transport the data to target warehouse tables using Informatica Workflow Manager.
  • Identified performance issues in existing sources, targets and mappings by analyzing the data flow, evaluating transformations and tuned accordingly for better performance.
  • Involved in Unit and Integration testing of Informatica Sessions, Batches and the Target Data.
  • Prepared ETL mapping Documents for every mapping and Data Migration document for smooth transfer of project from development to testing environment and then to production environment.
  • Actively involved in production support. Implemented fixes/solutions to issues/tickets raised by user community.
  • Created and unit tested several one-time scripts to fix production data issues.
  • Performance tuned multiple scripts using partition elimination techniques, improving update performance.
  • Used various performance enhancement techniques to enhance the performance of the sessions and workflows.
  • Performance tuning of sources, targets, mappings and SQL (query optimization).
  • Conducted Performance tuning of application by modifying the SQL statements and using Explain Plan and TOAD Software.
  • Worked closely with testing team and helped in creating test cases, setting up testing environment and testing the Informatica 8.6.1 mappings, workflows and Oracle scripts.
  • Used designer debugger to test the data flow and fix the mappings.
  • Used Informatica Version Control for checking in all versions of the objects used in creating the mappings, workflows to keep track of the changes in the development, test and production environment.
  • Used repository compare tool to compare the code versions between Production, Test and Dev.
  • Used HP Quality Center to create and maintain the history of defects during Integration Testing.
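Much of the unit and integration testing above reduces to reconciling source and target row counts after a workflow run. A minimal shell sketch of that check follows; it is hypothetical in that the counts would really come from SQL queries against the source system and the warehouse, whereas here they are passed in directly.

```shell
#!/bin/sh
# Hypothetical sketch: reconcile source vs. target row counts after an
# Informatica workflow run. In practice each count would come from a SQL
# query (e.g. SELECT COUNT(*) on each side); here they are arguments.
reconcile_counts() {
    table="$1"; src="$2"; tgt="$3"
    if [ "$src" -eq "$tgt" ]; then
        echo "$table: OK ($src rows)"
    else
        echo "$table: MISMATCH (source=$src target=$tgt)"
    fi
}
```

In a test cycle this kind of check runs once per target table, and any MISMATCH line becomes a defect logged in the test-management tool.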

Environment: Informatica Power Center 8.6.1, Oracle11g, SQL, Exadata, Toad, Unix, Control-M, PVCS, HP-Quality Center.

Confidential

FEA Developer

Responsibilities:

  • Writing proposals for the projects.
  • Attending client meetings to understand the scope of each project.
  • Developing mathematical models of real-life structures.
  • Solving complex mathematical equations using software such as Mathcad and Excel.
  • Developing Matlab code to analyze various structures.
  • Developing finite element (FE) models of real-life scenarios in software packages such as ABAQUS and SV Office.
  • Developing code to model real-life scenarios using various C, C++ and Visual Basic based software packages such as SAP2000, STAAD Pro, AutoCAD, SV Office and Geo Studio.
  • Developing test cases to unit test the results obtained from the analysis.
  • Conducting parametric study to obtain the variation in the results.
  • Presenting the results obtained from the research to the end users.
  • Developing design guidelines for the end users.
  • Writing reports and presenting the results to the clients.
  • Writing C programs to analyze the engineering properties of the elements.
  • Documenting handouts for the engineers to make the software easier to understand.
  • Instructing undergraduate students on various design and analysis software packages.
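The parametric studies above can be driven by a small wrapper that sweeps a parameter and invokes the solver once per value. The sketch below is hypothetical: the real runs used packages such as ABAQUS or Matlab, so the solver command is passed in rather than assumed.

```shell
#!/bin/sh
# Hypothetical sketch of a parametric-study driver: run a solver command
# once per parameter value. Everything after the first argument is the
# solver invocation; the parameter value is appended as its last argument.
run_sweep() {
    values="$1"; shift
    for v in $values; do
        # A real solver would read $v from an input deck; here it is
        # simply passed on the command line.
        "$@" "$v"
    done
}
```

A call such as `run_sweep "0.1 0.2 0.5" ./solve_beam.sh` (script name hypothetical) would then produce one result set per load value for comparison in the parametric study.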

Environment: C, C++, SAP 2000, Matlab, Oracle 10g, SQL Server, ABAQUS, SV Office, MS Office, Auto CAD, Revit, Geo Studio.
